Feb 25 07:17:25 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 25 07:17:25 crc restorecon[4689]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 25 07:17:25 crc restorecon[4689]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc 
restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc 
restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 
07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc 
restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc 
restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 
crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:25 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 
07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 25 07:17:26 crc restorecon[4689]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 
crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc 
restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 25 07:17:26 crc restorecon[4689]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 25 07:17:27 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.042946 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050002 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050034 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050046 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050055 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050065 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050074 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050081 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050089 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050098 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050106 4749 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050114 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050122 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050130 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050137 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050146 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050153 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050161 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050168 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050176 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050183 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050191 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050198 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050206 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050213 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050221 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050242 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050253 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050263 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050272 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050281 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050290 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050299 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050308 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050319 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050330 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050338 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050346 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050354 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050361 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050370 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050377 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050385 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050393 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050401 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050408 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050415 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050423 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050431 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050438 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050446 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050453 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050461 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050471 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050481 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050490 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050498 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050507 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050516 4749 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050525 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050532 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050540 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050547 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050555 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050576 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050585 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050618 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050626 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050634 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.050641 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.051571 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.051589 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052654 4749 flags.go:64] FLAG: --address="0.0.0.0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052680 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052694 4749 flags.go:64] FLAG: --anonymous-auth="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052706 4749 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052717 4749 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052726 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052737 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052748 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052757 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052766 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052780 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052789 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052798 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052807 4749 flags.go:64] FLAG: --cgroup-root=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052816 4749 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052824 4749 flags.go:64] FLAG: --client-ca-file=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052833 4749 flags.go:64] FLAG: --cloud-config=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052842 4749 flags.go:64] FLAG: --cloud-provider=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052850 4749 flags.go:64] FLAG: --cluster-dns="[]"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052889 4749 flags.go:64] FLAG: --cluster-domain=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052911 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052920 4749 flags.go:64] FLAG: --config-dir=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052929 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052938 4749 flags.go:64] FLAG: --container-log-max-files="5"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052949 4749 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052960 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052969 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052978 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052987 4749 flags.go:64] FLAG: --contention-profiling="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.052996 4749 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053005 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053014 4749 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053023 4749 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053034 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053043 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053052 4749 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053060 4749 flags.go:64] FLAG: --enable-load-reader="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053069 4749 flags.go:64] FLAG: --enable-server="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053079 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053090 4749 flags.go:64] FLAG: --event-burst="100"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053099 4749 flags.go:64] FLAG: --event-qps="50"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053107 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053116 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053126 4749 flags.go:64] FLAG: --eviction-hard=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053136 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053144 4749 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053154 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053164 4749 flags.go:64] FLAG: --eviction-soft=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053173 4749 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053182 4749 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053191 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053199 4749 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053208 4749 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053227 4749 flags.go:64] FLAG: --fail-swap-on="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053236 4749 flags.go:64] FLAG: --feature-gates=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053247 4749 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053257 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053272 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053282 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053291 4749 flags.go:64] FLAG: --healthz-port="10248"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053300 4749 flags.go:64] FLAG: --help="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053309 4749 flags.go:64] FLAG: --hostname-override=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053317 4749 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053326 4749 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053336 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053345 4749 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053354 4749 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053363 4749 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053371 4749 flags.go:64] FLAG: --image-service-endpoint=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053380 4749 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053389 4749 flags.go:64] FLAG: --kube-api-burst="100"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053398 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053407 4749 flags.go:64] FLAG: --kube-api-qps="50"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053416 4749 flags.go:64] FLAG: --kube-reserved=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053425 4749 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053433 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053443 4749 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053451 4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053460 4749 flags.go:64] FLAG: --lock-file=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053468 4749 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053478 4749 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053486 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053512 4749 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053521 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053530 4749 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053539 4749 flags.go:64] FLAG: --logging-format="text"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053548 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053557 4749 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053566 4749 flags.go:64] FLAG: --manifest-url=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053578 4749 flags.go:64] FLAG: --manifest-url-header=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053589 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053633 4749 flags.go:64] FLAG: --max-open-files="1000000"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053644 4749 flags.go:64] FLAG: --max-pods="110"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053652 4749 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053662 4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053671 4749 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053679 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053688 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053697 4749 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053706 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053725 4749 flags.go:64] FLAG: --node-status-max-images="50"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053735 4749 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053744 4749 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053753 4749 flags.go:64] FLAG: --pod-cidr=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053761 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053773 4749 flags.go:64] FLAG: --pod-manifest-path=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053782 4749 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053791 4749 flags.go:64] FLAG: --pods-per-core="0"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053800 4749 flags.go:64] FLAG: --port="10250"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053809 4749 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053818 4749 flags.go:64] FLAG: --provider-id=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053826 4749 flags.go:64] FLAG: --qos-reserved=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053835 4749 flags.go:64] FLAG: --read-only-port="10255"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053844 4749 flags.go:64] FLAG: --register-node="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053853 4749 flags.go:64] FLAG: --register-schedulable="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053862 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053877 4749 flags.go:64] FLAG: --registry-burst="10"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053886 4749 flags.go:64] FLAG: --registry-qps="5"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053895 4749 flags.go:64] FLAG: --reserved-cpus=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053904 4749 flags.go:64] FLAG: --reserved-memory=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053915 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053926 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053938 4749 flags.go:64] FLAG: --rotate-certificates="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053947 4749 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053956 4749 flags.go:64] FLAG: --runonce="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053965 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053974 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.053993 4749 flags.go:64] FLAG: --seccomp-default="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054002 4749 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054011 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054020 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054029 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054038 4749 flags.go:64] FLAG: --storage-driver-password="root"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054047 4749 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054056 4749 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054065 4749 flags.go:64] FLAG: --storage-driver-user="root"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054074 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054084 4749 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054093 4749 flags.go:64] FLAG: --system-cgroups=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054101 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054115 4749 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054123 4749 flags.go:64] FLAG: --tls-cert-file=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054133 4749 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054144 4749 flags.go:64] FLAG: --tls-min-version=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054153 4749 flags.go:64] FLAG: --tls-private-key-file=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054162 4749 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054171 4749 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054180 4749 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054190 4749 flags.go:64] FLAG: --v="2"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054200 4749 flags.go:64] FLAG: --version="false"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054211 4749 flags.go:64] FLAG: --vmodule=""
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054221 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.054231 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054443 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054456 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054464 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054472 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054480 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054488 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054496 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054503 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054514 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054524 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054533 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054541 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054550 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054558 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054567 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054576 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054584 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054615 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054623 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054631 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054653 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054661 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054668 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054676 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054684 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054692 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054700 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054708 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054716 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054723 4749 feature_gate.go:330] unrecognized feature gate: Example
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054731 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054739 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054746 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054754 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054762 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054769 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054777 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054786 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054794 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054801 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054809 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054817 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054825 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054832 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054839 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054847 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054855 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054862 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054870 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054877 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054885 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054893 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054901 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054910 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054921 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054931 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054942 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054952 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054962 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054970 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054979 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054987 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.054995 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055003 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055011 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055019 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055029 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055040 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055048 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055057 4749 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.055065 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.055089 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.068383 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.068436 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068567 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068581 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068590 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068622 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068630 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068639 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068648 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068657 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068665 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068672 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068681 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068688 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068696 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068707 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068719 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068728 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068738 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068747 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068756 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068764 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068773 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068780 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068789 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068796 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068806 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068816 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068824 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068834 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068846 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068854 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068862 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068870 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068879 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068888 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068896 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068904 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068913 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068922 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068931 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068939 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068949 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068960 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 07:17:27 crc 
kubenswrapper[4749]: W0225 07:17:27.068969 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068978 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068987 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.068995 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069005 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069015 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069026 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069034 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069042 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069050 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069058 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069066 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069075 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069083 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 
07:17:27.069092 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069100 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069110 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069118 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069125 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069133 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069141 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069149 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069158 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069166 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069173 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069181 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069189 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069197 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069204 4749 feature_gate.go:330] unrecognized feature 
gate: Example Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.069218 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069458 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069470 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069481 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069492 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069500 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069509 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069517 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069526 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069534 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069543 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069551 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069558 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069566 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069574 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069581 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069589 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069619 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 
07:17:27.069628 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069636 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069644 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069651 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069659 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069667 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069675 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069682 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069691 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069698 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069706 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069715 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069725 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069734 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069742 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069750 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069757 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069765 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069773 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069781 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069791 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069801 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069809 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069818 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069827 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069836 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069845 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069854 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069862 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069871 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069879 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069887 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069896 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069904 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069914 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069922 4749 feature_gate.go:330] unrecognized 
feature gate: DNSNameResolver Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069930 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069940 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069950 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069960 4749 feature_gate.go:330] unrecognized feature gate: Example Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069969 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069977 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069985 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.069992 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070000 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070007 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070015 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070024 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070032 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070040 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 25 07:17:27 crc 
kubenswrapper[4749]: W0225 07:17:27.070048 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070056 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070063 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.070071 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.070082 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.071257 4749 server.go:940] "Client rotation is on, will bootstrap in background" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.077499 4749 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.082976 4749 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.083107 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.084990 4749 server.go:997] "Starting client certificate rotation" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.085042 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.085507 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.109333 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.112838 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.115845 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.135994 4749 log.go:25] "Validated CRI v1 runtime API" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.174109 4749 log.go:25] "Validated CRI v1 image API" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.176158 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.181327 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-25-07-12-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.181560 4749 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.213123 4749 manager.go:217] Machine: {Timestamp:2026-02-25 07:17:27.209498362 +0000 UTC m=+0.571324432 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5bb6cb94-61d9-4bd6-9698-2d2f87101f9d BootID:1fea3a71-dd7f-464c-9c92-1cf7272a2aba Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:4e:d0:5d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4e:d0:5d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0f:ac:d7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:13:12:80 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:14:cd:9b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:af:2e:f7 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:85:8e:5c:ed:eb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:f5:31:8a:75:27 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.213551 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.213779 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.215859 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.216162 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.216218 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.216533 4749 topology_manager.go:138] "Creating topology manager with none policy" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.216550 4749 container_manager_linux.go:303] "Creating device plugin manager" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.217134 4749 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.217175 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.217414 4749 state_mem.go:36] "Initialized new in-memory state store" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.218018 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.224569 4749 kubelet.go:418] "Attempting to sync node with API server" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.224626 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.224652 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.224674 4749 kubelet.go:324] "Adding apiserver pod source" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.224691 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.228627 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.229780 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.231017 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.231198 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.231042 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.231302 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.232185 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 25 07:17:27 
crc kubenswrapper[4749]: I0225 07:17:27.233720 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233764 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233789 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233804 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233826 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233839 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233852 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233873 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233904 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233919 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233943 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.233957 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.237498 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.242482 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.245241 4749 server.go:1280] "Started kubelet" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.246395 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.246498 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 25 07:17:27 crc systemd[1]: Started Kubernetes Kubelet. Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.247380 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.249314 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.249380 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.249644 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.249691 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.249732 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.250256 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.250390 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed 
to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.250705 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.250880 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.251214 4749 server.go:460] "Adding debug handlers to kubelet server" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.251713 4749 factory.go:55] Registering systemd factory Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.251929 4749 factory.go:221] Registration of the systemd container factory successfully Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.252944 4749 factory.go:153] Registering CRI-O factory Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.253099 4749 factory.go:221] Registration of the crio container factory successfully Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.253326 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.253492 4749 factory.go:103] Registering Raw factory Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.253698 4749 manager.go:1196] Started watching for new ooms in manager Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.255426 4749 manager.go:319] Starting recovery of all 
containers Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.252812 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18976c1c4facdf6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,LastTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.264783 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265052 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265085 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265114 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265139 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265162 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265185 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265208 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265236 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265277 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265304 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265328 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265354 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265383 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265405 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265433 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265456 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265479 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265501 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265524 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265550 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265571 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265630 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265715 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265745 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265773 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265804 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265832 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" 
seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265869 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265897 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265921 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265948 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.265975 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266001 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 
07:17:27.266026 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266050 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266074 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266098 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266122 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266147 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266171 4749 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266196 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266218 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266244 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266270 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266299 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266322 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266347 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266372 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266394 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266418 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266455 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266489 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266518 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266544 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266570 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266633 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266666 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266693 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" 
seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266718 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266745 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266769 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266796 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266822 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266846 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 25 07:17:27 crc 
kubenswrapper[4749]: I0225 07:17:27.266871 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266895 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266917 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266939 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.266963 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267060 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267086 4749 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267111 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267134 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267159 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267184 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267210 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267233 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267257 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267309 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267333 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267360 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267384 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267411 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267437 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267459 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267481 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267507 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267529 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267552 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267574 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267693 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267721 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267770 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267801 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267826 
4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267851 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267875 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267899 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267926 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267951 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.267976 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.268001 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.268037 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.268064 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.268092 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.269518 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.270869 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.270907 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.270952 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.270994 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271022 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271059 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271086 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271118 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271143 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271168 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271200 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271228 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271252 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271284 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271311 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271343 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271369 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271393 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271426 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271448 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271479 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271503 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271527 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271558 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271581 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271644 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271670 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271693 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271734 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271793 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271827 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" 
seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271852 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271894 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271927 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271951 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.271977 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272013 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 
07:17:27.272036 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272071 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272095 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272120 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272153 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272178 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272211 4749 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272243 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272267 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272299 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272323 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272359 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272381 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272407 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272438 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272462 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272494 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272517 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272540 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272706 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272801 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272834 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272863 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.272956 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273044 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273126 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273237 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273265 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273335 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273417 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273443 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273508 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273531 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273706 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273734 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273753 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273815 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" 
seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273836 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273892 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273911 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.273931 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274002 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274027 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274085 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274106 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274126 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274182 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274201 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274226 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274276 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274374 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274402 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274465 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274490 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274513 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274560 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274586 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274643 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.274663 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281552 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281777 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281829 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281859 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281884 4749 reconstruct.go:97] "Volume reconstruction finished" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.281900 4749 reconciler.go:26] "Reconciler: start to sync state" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.294293 4749 manager.go:324] Recovery completed Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.310497 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.311859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.311894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.311906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.312810 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.312835 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.312855 4749 state_mem.go:36] "Initialized new in-memory state store" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.317399 4749 kubelet_network_linux.go:50] "Initialized 
iptables rules." protocol="IPv4" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.320940 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.321003 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.321048 4749 kubelet.go:2335] "Starting kubelet main sync loop" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.321119 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 25 07:17:27 crc kubenswrapper[4749]: W0225 07:17:27.321764 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.321835 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.332944 4749 policy_none.go:49] "None policy: Start" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.333861 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.333901 4749 state_mem.go:35] "Initializing new in-memory state store" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.351843 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.392637 4749 
manager.go:334] "Starting Device Plugin manager" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.392712 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.392733 4749 server.go:79] "Starting device plugin registration server" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.393275 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.393300 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.394244 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.394482 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.394519 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.402465 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.421914 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.422011 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423259 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423418 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.423466 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424532 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424664 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424697 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.424724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.425980 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc 
kubenswrapper[4749]: I0225 07:17:27.426148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.426189 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.426868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.426891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.426902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.426994 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427379 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427416 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.427895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.428173 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.428234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.428246 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.428260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.428274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.429302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.429364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.429389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.451538 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.484926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.484988 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485378 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485685 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.485816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.493628 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.496219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.496337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.496440 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.496580 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.497523 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587407 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588471 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.587481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 
25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588710 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588748 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 
07:17:27.588892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.588955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.589088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.589573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.698351 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.700570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.700626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.700642 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.700674 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.701105 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.790996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.812858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.814299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.825403 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: I0225 07:17:27.832677 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 25 07:17:27 crc kubenswrapper[4749]: E0225 07:17:27.853103 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.022630 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-f2e12102e8226b20b700c08267bd3250b78be2b983d8d0a31f2b3d5f4f3e695d WatchSource:0}: Error finding container f2e12102e8226b20b700c08267bd3250b78be2b983d8d0a31f2b3d5f4f3e695d: Status 404 returned error can't find the container with id f2e12102e8226b20b700c08267bd3250b78be2b983d8d0a31f2b3d5f4f3e695d Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.024994 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-eb19d8df63945025bf59ce296b883c27c567a6c42c773626126c8e97ece829f5 WatchSource:0}: Error finding container eb19d8df63945025bf59ce296b883c27c567a6c42c773626126c8e97ece829f5: Status 404 returned error can't find the container with id eb19d8df63945025bf59ce296b883c27c567a6c42c773626126c8e97ece829f5 Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.027138 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-369af307bb8831723cb12cf5e784224e054e4f2203d182978bee506fc39483b3 WatchSource:0}: Error finding container 369af307bb8831723cb12cf5e784224e054e4f2203d182978bee506fc39483b3: Status 404 returned error can't find the container with id 369af307bb8831723cb12cf5e784224e054e4f2203d182978bee506fc39483b3 Feb 25 
07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.028514 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-07f46bd0b2605b3930575daa87583c1c606697e131a7dd23443c27c83f154e6d WatchSource:0}: Error finding container 07f46bd0b2605b3930575daa87583c1c606697e131a7dd23443c27c83f154e6d: Status 404 returned error can't find the container with id 07f46bd0b2605b3930575daa87583c1c606697e131a7dd23443c27c83f154e6d Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.030798 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-bf67359eb55264b03a9aab946869093e1ae288b38af65b6b03857028979e6369 WatchSource:0}: Error finding container bf67359eb55264b03a9aab946869093e1ae288b38af65b6b03857028979e6369: Status 404 returned error can't find the container with id bf67359eb55264b03a9aab946869093e1ae288b38af65b6b03857028979e6369 Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.055982 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.056097 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.058845 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.059030 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.101563 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.102972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.103032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.103051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.103086 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.103632 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.154273 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:28 crc 
kubenswrapper[4749]: E0225 07:17:28.154427 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.244126 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:28 crc kubenswrapper[4749]: W0225 07:17:28.285313 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.285444 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.328472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"eb19d8df63945025bf59ce296b883c27c567a6c42c773626126c8e97ece829f5"} Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.329898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bf67359eb55264b03a9aab946869093e1ae288b38af65b6b03857028979e6369"} Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.331095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"369af307bb8831723cb12cf5e784224e054e4f2203d182978bee506fc39483b3"} Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.334195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f2e12102e8226b20b700c08267bd3250b78be2b983d8d0a31f2b3d5f4f3e695d"} Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.335742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07f46bd0b2605b3930575daa87583c1c606697e131a7dd23443c27c83f154e6d"} Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.653843 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.904705 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.906902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.906960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:28 crc 
kubenswrapper[4749]: I0225 07:17:28.906979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:28 crc kubenswrapper[4749]: I0225 07:17:28.907018 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:28 crc kubenswrapper[4749]: E0225 07:17:28.907769 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.192086 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 07:17:29 crc kubenswrapper[4749]: E0225 07:17:29.193751 4749 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.243697 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.343646 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b"} Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.343781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497"} Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.345940 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f" exitCode=0 Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.346035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f"} Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.346122 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.350411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.350479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.350517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.353183 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a76f4a50286ff8c66a8d2cf65077f0ee8e9102da6b79aa791b372e6d4d97e60b" exitCode=0 Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.353275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a76f4a50286ff8c66a8d2cf65077f0ee8e9102da6b79aa791b372e6d4d97e60b"} Feb 25 07:17:29 crc 
kubenswrapper[4749]: I0225 07:17:29.353313 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.353985 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.355785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.356568 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772" exitCode=0 Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.356738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772"} Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.356810 4749 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.358729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.358772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.358791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.359834 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1" exitCode=0 Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.359882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1"} Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.360022 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.361631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.361682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:29 crc kubenswrapper[4749]: I0225 07:17:29.361700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.243742 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:30 crc kubenswrapper[4749]: E0225 07:17:30.254840 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Feb 25 07:17:30 crc kubenswrapper[4749]: W0225 07:17:30.331375 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:30 crc kubenswrapper[4749]: E0225 07:17:30.331482 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.364803 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245" exitCode=0 Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.364858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.364974 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.368792 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.368831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.368849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.382247 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.382243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.382309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.382324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.385051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.385103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.385114 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.387313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.387345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.387405 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.388306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.388328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.388336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.391573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.391620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.391630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.391640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.395995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fa96757af4af0aaac70e84057b3ada8721f0feef8bd00c89cecee27a2490e198"} Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.396115 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.397040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.397074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.397084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.508580 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:30 crc kubenswrapper[4749]: 
I0225 07:17:30.510176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.510221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.510237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:30 crc kubenswrapper[4749]: I0225 07:17:30.510264 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:30 crc kubenswrapper[4749]: E0225 07:17:30.510881 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Feb 25 07:17:30 crc kubenswrapper[4749]: W0225 07:17:30.585430 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Feb 25 07:17:30 crc kubenswrapper[4749]: E0225 07:17:30.586364 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.404258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cad267a45a8e696b61c57adbc0d4e3897ded20a959780214b696d5efdfacd5de"} Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 
07:17:31.404331 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.405460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.405494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.405511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.407931 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2" exitCode=0 Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408030 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408167 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408266 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2"} Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408173 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.408546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.409923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.410339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:31 crc kubenswrapper[4749]: I0225 07:17:31.410415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:31 crc kubenswrapper[4749]: 
I0225 07:17:31.410443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.413968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75"} Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b"} Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414065 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414172 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1"} Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe"} Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.414931 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.419187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.419312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.420071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.420106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.420076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.420137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:32 crc kubenswrapper[4749]: I0225 07:17:32.681351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.350223 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.422550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d"} Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.422649 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.422697 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.424438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.712013 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.713799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.713878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.713899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:33 crc kubenswrapper[4749]: I0225 07:17:33.713937 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.369890 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.370255 4749 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.372120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.372166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.372185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.401480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.425185 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.425297 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.425860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.425947 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.426025 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427093 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.427555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:34 crc kubenswrapper[4749]: I0225 07:17:34.964492 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.377834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.427436 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.427926 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 
07:17:35.428094 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:35 crc kubenswrapper[4749]: I0225 07:17:35.429929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:36 crc kubenswrapper[4749]: I0225 07:17:36.430578 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:36 crc kubenswrapper[4749]: I0225 07:17:36.431838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 07:17:36 crc kubenswrapper[4749]: I0225 07:17:36.431880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:36 crc kubenswrapper[4749]: I0225 07:17:36.431898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:37 crc kubenswrapper[4749]: E0225 07:17:37.402699 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.215784 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.216020 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.217662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.217723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.217743 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.224067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.436339 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.437876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.437940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:38 crc kubenswrapper[4749]: I0225 07:17:38.437966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:41 crc kubenswrapper[4749]: W0225 07:17:41.049785 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.049895 4749 trace.go:236] Trace[2053660548]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Feb-2026 07:17:31.048) (total time: 10001ms): Feb 25 07:17:41 crc kubenswrapper[4749]: Trace[2053660548]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:17:41.049) Feb 25 07:17:41 crc kubenswrapper[4749]: Trace[2053660548]: [10.001700181s] [10.001700181s] END Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.049920 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.216473 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.216569 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.244835 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 25 07:17:41 crc kubenswrapper[4749]: W0225 07:17:41.293387 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.293497 4749 trace.go:236] Trace[1733787027]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Feb-2026 07:17:31.291) (total time: 10001ms): Feb 25 07:17:41 crc kubenswrapper[4749]: Trace[1733787027]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:17:41.293) Feb 25 07:17:41 crc kubenswrapper[4749]: Trace[1733787027]: [10.00147358s] [10.00147358s] END Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.293520 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" 
logger="UnhandledError" Feb 25 07:17:41 crc kubenswrapper[4749]: W0225 07:17:41.479051 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.479175 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 07:17:41 crc kubenswrapper[4749]: W0225 07:17:41.480278 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.480431 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.495338 4749 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.499462 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.500323 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.18976c1c4facdf6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,LastTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:41 crc kubenswrapper[4749]: E0225 07:17:41.502843 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:41Z is after 2026-02-23T05:33:13Z" node="crc" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.510201 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.510262 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.518794 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.518864 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.848972 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.849224 
4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.850979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.851027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.851037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:41 crc kubenswrapper[4749]: I0225 07:17:41.900424 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.248415 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:42Z is after 2026-02-23T05:33:13Z Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.452534 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.456275 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cad267a45a8e696b61c57adbc0d4e3897ded20a959780214b696d5efdfacd5de" exitCode=255 Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.456359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cad267a45a8e696b61c57adbc0d4e3897ded20a959780214b696d5efdfacd5de"} Feb 25 
07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.456523 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.456565 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.458734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.458864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.458899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.460547 4749 scope.go:117] "RemoveContainer" containerID="cad267a45a8e696b61c57adbc0d4e3897ded20a959780214b696d5efdfacd5de" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.462153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.462219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.462245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:42 crc kubenswrapper[4749]: I0225 07:17:42.498016 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.248356 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-25T07:17:43Z is after 2026-02-23T05:33:13Z Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.461365 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.464158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e"} Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.464271 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.464440 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:43 crc kubenswrapper[4749]: I0225 07:17:43.465946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.247152 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:44Z is after 2026-02-23T05:33:13Z Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.434467 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.469681 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.470275 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.472949 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" exitCode=255 Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.473003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e"} Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.473076 4749 scope.go:117] "RemoveContainer" containerID="cad267a45a8e696b61c57adbc0d4e3897ded20a959780214b696d5efdfacd5de" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.473085 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.475947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.476014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.476038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.477581 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:44 crc kubenswrapper[4749]: E0225 07:17:44.477947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:44 crc kubenswrapper[4749]: I0225 07:17:44.482040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.248153 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:45Z is after 2026-02-23T05:33:13Z Feb 25 07:17:45 crc kubenswrapper[4749]: W0225 07:17:45.374072 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:45Z is after 2026-02-23T05:33:13Z Feb 25 07:17:45 crc kubenswrapper[4749]: E0225 07:17:45.374189 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.482492 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.485678 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.487113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.487199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.487218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:45 crc kubenswrapper[4749]: I0225 07:17:45.488075 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:45 crc kubenswrapper[4749]: E0225 07:17:45.488353 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:46 crc kubenswrapper[4749]: I0225 07:17:46.247535 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:46Z is after 2026-02-23T05:33:13Z Feb 25 07:17:46 crc kubenswrapper[4749]: W0225 07:17:46.308707 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:46Z is after 2026-02-23T05:33:13Z Feb 25 07:17:46 crc kubenswrapper[4749]: E0225 07:17:46.308843 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:17:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 25 07:17:46 crc kubenswrapper[4749]: I0225 07:17:46.489009 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:46 crc kubenswrapper[4749]: I0225 07:17:46.490421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:46 
crc kubenswrapper[4749]: I0225 07:17:46.490492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:46 crc kubenswrapper[4749]: I0225 07:17:46.490516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:46 crc kubenswrapper[4749]: I0225 07:17:46.491390 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:46 crc kubenswrapper[4749]: E0225 07:17:46.491721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:47 crc kubenswrapper[4749]: I0225 07:17:47.251446 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:47 crc kubenswrapper[4749]: E0225 07:17:47.402840 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:17:47 crc kubenswrapper[4749]: I0225 07:17:47.903894 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:47 crc kubenswrapper[4749]: I0225 07:17:47.905382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:47 crc kubenswrapper[4749]: I0225 07:17:47.905433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:47 crc 
kubenswrapper[4749]: I0225 07:17:47.905451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:47 crc kubenswrapper[4749]: I0225 07:17:47.905483 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:47 crc kubenswrapper[4749]: E0225 07:17:47.908205 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 07:17:47 crc kubenswrapper[4749]: E0225 07:17:47.908544 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.251075 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.739760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.740009 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.741465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.741514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.741534 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:48 crc kubenswrapper[4749]: I0225 07:17:48.742396 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:48 crc kubenswrapper[4749]: E0225 07:17:48.742743 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:49 crc kubenswrapper[4749]: I0225 07:17:49.251125 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:49 crc kubenswrapper[4749]: I0225 07:17:49.554837 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 25 07:17:49 crc kubenswrapper[4749]: I0225 07:17:49.575982 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 07:17:49 crc kubenswrapper[4749]: W0225 07:17:49.987004 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:49 crc kubenswrapper[4749]: E0225 07:17:49.987081 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group 
\"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 07:17:50 crc kubenswrapper[4749]: W0225 07:17:50.025294 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 25 07:17:50 crc kubenswrapper[4749]: E0225 07:17:50.025392 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 07:17:50 crc kubenswrapper[4749]: I0225 07:17:50.250065 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:51 crc kubenswrapper[4749]: I0225 07:17:51.216168 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 07:17:51 crc kubenswrapper[4749]: I0225 07:17:51.216301 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 25 07:17:51 crc kubenswrapper[4749]: I0225 07:17:51.247871 4749 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.508146 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c4facdf6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,LastTimestamp:2026-02-25 07:17:27.245193066 +0000 UTC m=+0.607019126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.515875 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.522717 4749 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.527699 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.534120 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.18976c1c58dc935f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.399314271 +0000 UTC m=+0.761140301,LastTimestamp:2026-02-25 07:17:27.399314271 +0000 UTC m=+0.761140301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.538849 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.423080002 +0000 UTC m=+0.784906032,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.546188 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.423105044 +0000 UTC m=+0.784931074,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.552802 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.423118864 +0000 UTC m=+0.784944894,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.560009 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.424286015 +0000 UTC m=+0.786112045,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.566820 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.424309565 +0000 UTC m=+0.786135595,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.574405 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC 
m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.424321726 +0000 UTC m=+0.786147756,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.586509 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.424701737 +0000 UTC m=+0.786527767,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.593747 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.424719557 +0000 UTC m=+0.786545587,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.601136 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.424731207 +0000 UTC m=+0.786557237,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.609244 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.425830576 +0000 UTC m=+0.787656606,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.616923 4749 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.425853357 +0000 UTC m=+0.787679387,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.624378 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.425868267 +0000 UTC m=+0.787694297,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.631740 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.4259413 +0000 UTC m=+0.787767330,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.638952 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.425952811 +0000 UTC m=+0.787778841,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.645741 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.425989102 +0000 UTC m=+0.787815132,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.653885 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.426885925 +0000 UTC m=+0.788711955,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.660428 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.426898715 +0000 UTC m=+0.788724745,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.667533 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6ec99\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6ec99 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311912089 +0000 UTC m=+0.673738119,LastTimestamp:2026-02-25 07:17:27.426908596 +0000 UTC m=+0.788734626,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.673962 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a687f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a687f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311886328 +0000 UTC 
m=+0.673712358,LastTimestamp:2026-02-25 07:17:27.427535533 +0000 UTC m=+0.789361563,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.680509 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18976c1c53a6c5c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18976c1c53a6c5c5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:27.311902149 +0000 UTC m=+0.673728179,LastTimestamp:2026-02-25 07:17:27.427551854 +0000 UTC m=+0.789377884,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.688661 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1c81fa41b2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.089125298 +0000 UTC m=+1.450951348,LastTimestamp:2026-02-25 07:17:28.089125298 +0000 UTC m=+1.450951348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.695210 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1c81fc1891 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.089245841 +0000 UTC m=+1.451071901,LastTimestamp:2026-02-25 07:17:28.089245841 +0000 UTC m=+1.451071901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.703433 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1c8202da8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.089688714 +0000 UTC m=+1.451514744,LastTimestamp:2026-02-25 07:17:28.089688714 +0000 UTC m=+1.451514744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.709845 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1c820513da openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.089834458 +0000 UTC m=+1.451660488,LastTimestamp:2026-02-25 07:17:28.089834458 +0000 UTC m=+1.451660488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 
07:17:51.716408 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1c820d4d38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.090373432 +0000 UTC m=+1.452199492,LastTimestamp:2026-02-25 07:17:28.090373432 +0000 UTC m=+1.452199492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.723362 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ca8b7abbe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.739072958 +0000 UTC m=+2.100899018,LastTimestamp:2026-02-25 07:17:28.739072958 +0000 UTC m=+2.100899018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.729747 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1ca8b91515 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.739165461 +0000 UTC m=+2.100991521,LastTimestamp:2026-02-25 07:17:28.739165461 +0000 UTC m=+2.100991521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.737205 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ca8ba9b90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.739265424 +0000 UTC 
m=+2.101091474,LastTimestamp:2026-02-25 07:17:28.739265424 +0000 UTC m=+2.101091474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.743488 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1ca8d493ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.74096737 +0000 UTC m=+2.102793420,LastTimestamp:2026-02-25 07:17:28.74096737 +0000 UTC m=+2.102793420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.749842 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1ca8f44978 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.743045496 +0000 UTC 
m=+2.104871556,LastTimestamp:2026-02-25 07:17:28.743045496 +0000 UTC m=+2.104871556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.756237 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ca9c58601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.756758017 +0000 UTC m=+2.118584047,LastTimestamp:2026-02-25 07:17:28.756758017 +0000 UTC m=+2.118584047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.763401 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ca9dc539f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.758252447 +0000 UTC m=+2.120078507,LastTimestamp:2026-02-25 07:17:28.758252447 +0000 UTC m=+2.120078507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.770334 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1ca9dfb423 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.758473763 +0000 UTC m=+2.120299813,LastTimestamp:2026-02-25 07:17:28.758473763 +0000 UTC m=+2.120299813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.777673 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1ca9e252d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.758645457 +0000 UTC m=+2.120471517,LastTimestamp:2026-02-25 07:17:28.758645457 +0000 UTC m=+2.120471517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.785160 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ca9ea11e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.759153122 +0000 UTC m=+2.120979152,LastTimestamp:2026-02-25 07:17:28.759153122 +0000 UTC m=+2.120979152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.792189 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1caa2a4da5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.763362725 +0000 UTC m=+2.125188785,LastTimestamp:2026-02-25 07:17:28.763362725 +0000 UTC m=+2.125188785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.799413 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cbeee2de0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.111743968 +0000 UTC m=+2.473569998,LastTimestamp:2026-02-25 07:17:29.111743968 +0000 UTC m=+2.473569998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.806766 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cbfa42836 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.12367007 +0000 UTC m=+2.485496100,LastTimestamp:2026-02-25 07:17:29.12367007 +0000 UTC m=+2.485496100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.813675 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cbfb7efb5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.124966325 +0000 UTC m=+2.486792355,LastTimestamp:2026-02-25 07:17:29.124966325 +0000 UTC m=+2.486792355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.819326 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ccd56183a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.353435194 +0000 UTC m=+2.715261254,LastTimestamp:2026-02-25 07:17:29.353435194 +0000 UTC m=+2.715261254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.826760 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ccd6792b3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.354580659 +0000 UTC m=+2.716406719,LastTimestamp:2026-02-25 07:17:29.354580659 +0000 UTC m=+2.716406719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.835172 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1ccda497c1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.358579649 +0000 UTC m=+2.720405709,LastTimestamp:2026-02-25 07:17:29.358579649 +0000 UTC m=+2.720405709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.842904 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1ccdc8f2fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.360962298 +0000 UTC m=+2.722788358,LastTimestamp:2026-02-25 07:17:29.360962298 +0000 UTC m=+2.722788358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.850383 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cce5888d0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.370372304 +0000 UTC m=+2.732198364,LastTimestamp:2026-02-25 07:17:29.370372304 +0000 UTC m=+2.732198364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.857166 4749 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cce6ee917 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.371838743 +0000 UTC m=+2.733664773,LastTimestamp:2026-02-25 07:17:29.371838743 +0000 UTC m=+2.733664773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.864997 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ccecb9d1b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 
07:17:29.377914139 +0000 UTC m=+2.739740159,LastTimestamp:2026-02-25 07:17:29.377914139 +0000 UTC m=+2.739740159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.872036 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cdcb4decc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.611304652 +0000 UTC m=+2.973130672,LastTimestamp:2026-02-25 07:17:29.611304652 +0000 UTC m=+2.973130672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.878951 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cdcdf76ff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.614096127 +0000 UTC m=+2.975922147,LastTimestamp:2026-02-25 07:17:29.614096127 +0000 UTC m=+2.975922147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.884513 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cdd042bb7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.616501687 +0000 UTC m=+2.978327707,LastTimestamp:2026-02-25 07:17:29.616501687 +0000 UTC m=+2.978327707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.890974 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1cdd082a94 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.61676354 +0000 UTC m=+2.978589550,LastTimestamp:2026-02-25 07:17:29.61676354 +0000 UTC m=+2.978589550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.899515 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1cdd1c11e0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.618067936 +0000 UTC m=+2.979893956,LastTimestamp:2026-02-25 07:17:29.618067936 +0000 UTC m=+2.979893956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.906778 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cddecc25e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.631744606 +0000 UTC m=+2.993570626,LastTimestamp:2026-02-25 07:17:29.631744606 +0000 UTC m=+2.993570626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.913926 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cddf8c6f8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.632532216 +0000 UTC m=+2.994358236,LastTimestamp:2026-02-25 07:17:29.632532216 +0000 UTC m=+2.994358236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.920576 
4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cde0b52d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.633747671 +0000 UTC m=+2.995573691,LastTimestamp:2026-02-25 07:17:29.633747671 +0000 UTC m=+2.995573691,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.927949 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18976c1cde110614 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.634121236 +0000 UTC m=+2.995947256,LastTimestamp:2026-02-25 07:17:29.634121236 +0000 UTC 
m=+2.995947256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.935206 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cde154c4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.634401359 +0000 UTC m=+2.996227419,LastTimestamp:2026-02-25 07:17:29.634401359 +0000 UTC m=+2.996227419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.942590 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cde309e93 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.636191891 +0000 UTC m=+2.998017911,LastTimestamp:2026-02-25 07:17:29.636191891 +0000 UTC m=+2.998017911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.950937 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1ce9920fd0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.827127248 +0000 UTC m=+3.188953288,LastTimestamp:2026-02-25 07:17:29.827127248 +0000 UTC m=+3.188953288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.957668 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ce9b65962 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.829505378 +0000 UTC m=+3.191331408,LastTimestamp:2026-02-25 07:17:29.829505378 +0000 UTC m=+3.191331408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.962424 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cea28bf10 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.837002512 +0000 UTC m=+3.198828532,LastTimestamp:2026-02-25 07:17:29.837002512 +0000 UTC m=+3.198828532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.968773 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cea3ca539 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.838306617 +0000 UTC m=+3.200132637,LastTimestamp:2026-02-25 07:17:29.838306617 +0000 UTC m=+3.200132637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.972939 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ceab2af10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.846042384 +0000 UTC m=+3.207868404,LastTimestamp:2026-02-25 07:17:29.846042384 +0000 UTC m=+3.207868404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 
07:17:51.979086 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1ceac54823 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.847261219 +0000 UTC m=+3.209087239,LastTimestamp:2026-02-25 07:17:29.847261219 +0000 UTC m=+3.209087239,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.983788 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cf6467e23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.040278563 +0000 UTC 
m=+3.402104583,LastTimestamp:2026-02-25 07:17:30.040278563 +0000 UTC m=+3.402104583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.989618 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cf66d2f2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.042814254 +0000 UTC m=+3.404640294,LastTimestamp:2026-02-25 07:17:30.042814254 +0000 UTC m=+3.404640294,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.994544 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cf71a43ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.054157295 +0000 UTC m=+3.415983325,LastTimestamp:2026-02-25 07:17:30.054157295 +0000 UTC m=+3.415983325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:51 crc kubenswrapper[4749]: E0225 07:17:51.999819 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1cf72be2ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.055312078 +0000 UTC m=+3.417138098,LastTimestamp:2026-02-25 07:17:30.055312078 +0000 UTC m=+3.417138098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.007357 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18976c1cf74df22c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.057544236 +0000 UTC m=+3.419370266,LastTimestamp:2026-02-25 07:17:30.057544236 +0000 UTC m=+3.419370266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.012224 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1cfc80d59a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.144765338 +0000 UTC m=+3.506591388,LastTimestamp:2026-02-25 07:17:30.144765338 +0000 UTC m=+3.506591388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.019017 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d030fd61d openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.254800413 +0000 UTC m=+3.616626433,LastTimestamp:2026-02-25 07:17:30.254800413 +0000 UTC m=+3.616626433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.023691 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d03dad7db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.268104667 +0000 UTC m=+3.629930687,LastTimestamp:2026-02-25 07:17:30.268104667 +0000 UTC m=+3.629930687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.028628 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d03e90fb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.269036469 +0000 UTC m=+3.630862489,LastTimestamp:2026-02-25 07:17:30.269036469 +0000 UTC m=+3.630862489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.032903 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d0a5f695c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.377455964 +0000 UTC m=+3.739282004,LastTimestamp:2026-02-25 07:17:30.377455964 +0000 UTC m=+3.739282004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.035530 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d0fdc9e0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.469547535 +0000 UTC m=+3.831373565,LastTimestamp:2026-02-25 07:17:30.469547535 +0000 UTC m=+3.831373565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.040533 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d1070bbae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.479254446 +0000 UTC m=+3.841080456,LastTimestamp:2026-02-25 
07:17:30.479254446 +0000 UTC m=+3.841080456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.046273 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d176334f6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.595808502 +0000 UTC m=+3.957634532,LastTimestamp:2026-02-25 07:17:30.595808502 +0000 UTC m=+3.957634532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.052482 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d186102bb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.612441787 +0000 UTC m=+3.974267807,LastTimestamp:2026-02-25 07:17:30.612441787 +0000 UTC 
m=+3.974267807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.058871 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d480cbe64 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.412225636 +0000 UTC m=+4.774051656,LastTimestamp:2026-02-25 07:17:31.412225636 +0000 UTC m=+4.774051656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.064764 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d571232c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.664241352 +0000 UTC 
m=+5.026067382,LastTimestamp:2026-02-25 07:17:31.664241352 +0000 UTC m=+5.026067382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.069901 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d57b6eb81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.675036545 +0000 UTC m=+5.036862565,LastTimestamp:2026-02-25 07:17:31.675036545 +0000 UTC m=+5.036862565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.073895 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d57c672e8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.676054248 +0000 UTC m=+5.037880268,LastTimestamp:2026-02-25 07:17:31.676054248 +0000 UTC m=+5.037880268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.078551 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d6479f26e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.88914443 +0000 UTC m=+5.250970460,LastTimestamp:2026-02-25 07:17:31.88914443 +0000 UTC m=+5.250970460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.083924 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d6544af6d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.902431085 +0000 UTC 
m=+5.264257145,LastTimestamp:2026-02-25 07:17:31.902431085 +0000 UTC m=+5.264257145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.090881 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d655a88d1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:31.903862993 +0000 UTC m=+5.265689013,LastTimestamp:2026-02-25 07:17:31.903862993 +0000 UTC m=+5.265689013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.099120 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d727406bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.123637439 +0000 UTC m=+5.485463489,LastTimestamp:2026-02-25 07:17:32.123637439 +0000 UTC m=+5.485463489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.104027 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d73288c85 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.135468165 +0000 UTC m=+5.497294225,LastTimestamp:2026-02-25 07:17:32.135468165 +0000 UTC m=+5.497294225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.108531 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d733c25b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.136752561 +0000 UTC m=+5.498578621,LastTimestamp:2026-02-25 07:17:32.136752561 +0000 UTC m=+5.498578621,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.113268 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d7ef3d590 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.333340048 +0000 UTC m=+5.695166068,LastTimestamp:2026-02-25 07:17:32.333340048 +0000 UTC m=+5.695166068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.117744 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d7feaeda0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.3495336 +0000 UTC m=+5.711359630,LastTimestamp:2026-02-25 07:17:32.3495336 +0000 UTC m=+5.711359630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.122319 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d800295df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.351083999 +0000 UTC m=+5.712910019,LastTimestamp:2026-02-25 07:17:32.351083999 +0000 UTC m=+5.712910019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.128891 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d8e56849e openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.59146563 +0000 UTC m=+5.953291690,LastTimestamp:2026-02-25 07:17:32.59146563 +0000 UTC m=+5.953291690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.134688 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18976c1d8f83a379 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:32.611199865 +0000 UTC m=+5.973025925,LastTimestamp:2026-02-25 07:17:32.611199865 +0000 UTC m=+5.973025925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.140030 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 07:17:52 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.18976c1f906ecfdb openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 07:17:52 crc kubenswrapper[4749]: body: Feb 25 07:17:52 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216546779 +0000 UTC m=+14.578372839,LastTimestamp:2026-02-25 07:17:41.216546779 +0000 UTC m=+14.578372839,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 07:17:52 crc kubenswrapper[4749]: > Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.144273 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1f9070526d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216645741 +0000 UTC m=+14.578471801,LastTimestamp:2026-02-25 07:17:41.216645741 
+0000 UTC m=+14.578471801,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.150641 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 25 07:17:52 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.18976c1fa1f04673 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 07:17:52 crc kubenswrapper[4749]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 07:17:52 crc kubenswrapper[4749]: Feb 25 07:17:52 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.510243955 +0000 UTC m=+14.872069995,LastTimestamp:2026-02-25 07:17:41.510243955 +0000 UTC m=+14.872069995,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 07:17:52 crc kubenswrapper[4749]: > Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.156477 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1fa1f0f56b openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.510288747 +0000 UTC m=+14.872114777,LastTimestamp:2026-02-25 07:17:41.510288747 +0000 UTC m=+14.872114777,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.161560 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18976c1fa1f04673\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 25 07:17:52 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-apiserver-crc.18976c1fa1f04673 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 25 07:17:52 crc kubenswrapper[4749]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 25 07:17:52 crc kubenswrapper[4749]: Feb 25 07:17:52 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.510243955 +0000 UTC m=+14.872069995,LastTimestamp:2026-02-25 07:17:41.518843248 +0000 UTC 
m=+14.880669278,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 07:17:52 crc kubenswrapper[4749]: > Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.166668 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18976c1fa1f0f56b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1fa1f0f56b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.510288747 +0000 UTC m=+14.872114777,LastTimestamp:2026-02-25 07:17:41.51889337 +0000 UTC m=+14.880719400,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.175148 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18976c1d03e90fb5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d03e90fb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.269036469 +0000 UTC m=+3.630862489,LastTimestamp:2026-02-25 07:17:42.46354485 +0000 UTC m=+15.825370900,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.180003 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.18976c1d0fdc9e0f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d0fdc9e0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.469547535 +0000 UTC m=+3.831373565,LastTimestamp:2026-02-25 07:17:42.694488984 +0000 UTC m=+16.056315004,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.187258 4749 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.18976c1d1070bbae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18976c1d1070bbae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:30.479254446 +0000 UTC m=+3.841080456,LastTimestamp:2026-02-25 07:17:42.701746075 +0000 UTC m=+16.063572095,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.198812 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1f906ecfdb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 07:17:52 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.18976c1f906ecfdb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 25 07:17:52 crc kubenswrapper[4749]: body: Feb 25 07:17:52 crc 
kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216546779 +0000 UTC m=+14.578372839,LastTimestamp:2026-02-25 07:17:51.216271451 +0000 UTC m=+24.578097511,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 07:17:52 crc kubenswrapper[4749]: > Feb 25 07:17:52 crc kubenswrapper[4749]: E0225 07:17:52.204685 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1f9070526d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1f9070526d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216645741 +0000 UTC m=+14.578471801,LastTimestamp:2026-02-25 07:17:51.216348173 +0000 UTC m=+24.578174223,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:17:52 crc kubenswrapper[4749]: I0225 07:17:52.248368 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:53 crc 
kubenswrapper[4749]: I0225 07:17:53.249589 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.084538 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.084828 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.086533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.086641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.086671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.087568 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.251075 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.516789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.520046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662"} Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.908577 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.910663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.910728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.910759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:54 crc kubenswrapper[4749]: I0225 07:17:54.910801 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:17:54 crc kubenswrapper[4749]: E0225 07:17:54.916664 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 07:17:54 crc kubenswrapper[4749]: E0225 07:17:54.917118 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.251092 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.528660 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.530772 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.532806 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662" exitCode=255 Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.532862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662"} Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.532903 4749 scope.go:117] "RemoveContainer" containerID="a79441da5c4d2c87de594dc22c45c996f316d7136770f4655934c2e818d5f01e" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.532947 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.533988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.534031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.534049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:55 crc kubenswrapper[4749]: I0225 07:17:55.534953 4749 scope.go:117] "RemoveContainer" 
containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662" Feb 25 07:17:55 crc kubenswrapper[4749]: E0225 07:17:55.535231 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.249312 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.538139 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.541352 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.542503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.542644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.542664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:56 crc kubenswrapper[4749]: I0225 07:17:56.543441 4749 scope.go:117] "RemoveContainer" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662" Feb 25 07:17:56 crc kubenswrapper[4749]: E0225 
07:17:56.543788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:57 crc kubenswrapper[4749]: I0225 07:17:57.249711 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:57 crc kubenswrapper[4749]: E0225 07:17:57.403186 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:17:57 crc kubenswrapper[4749]: W0225 07:17:57.512915 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 25 07:17:57 crc kubenswrapper[4749]: E0225 07:17:57.513007 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.250744 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:17:58 crc kubenswrapper[4749]: W0225 07:17:58.291800 
4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 25 07:17:58 crc kubenswrapper[4749]: E0225 07:17:58.291860 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.739955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.740274 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.742060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.742145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.742168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:17:58 crc kubenswrapper[4749]: I0225 07:17:58.743300 4749 scope.go:117] "RemoveContainer" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662" Feb 25 07:17:58 crc kubenswrapper[4749]: E0225 07:17:58.743645 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:17:59 crc kubenswrapper[4749]: I0225 07:17:59.245644 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.226773 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41786->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.226852 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41786->192.168.126.11:10357: read: connection reset by peer" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.226932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.227168 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.228739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.228794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 
07:18:00.228812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.229484 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.229781 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b" gracePeriod=30 Feb 25 07:18:00 crc kubenswrapper[4749]: E0225 07:18:00.234961 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 25 07:18:00 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.18976c23fd88a581 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:41786->192.168.126.11:10357: read: connection reset by peer Feb 25 07:18:00 crc kubenswrapper[4749]: body: Feb 25 07:18:00 crc kubenswrapper[4749]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:18:00.226825601 +0000 UTC m=+33.588651652,LastTimestamp:2026-02-25 07:18:00.226825601 +0000 UTC m=+33.588651652,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 25 07:18:00 crc kubenswrapper[4749]: > Feb 25 07:18:00 crc kubenswrapper[4749]: E0225 07:18:00.240405 4749 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c23fd899d53 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41786->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:18:00.226889043 +0000 UTC m=+33.588715103,LastTimestamp:2026-02-25 07:18:00.226889043 +0000 UTC m=+33.588715103,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.247495 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:00 crc kubenswrapper[4749]: E0225 07:18:00.247898 4749 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c23fdb5569a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:18:00.229754522 +0000 UTC m=+33.591580582,LastTimestamp:2026-02-25 07:18:00.229754522 +0000 UTC m=+33.591580582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.555788 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.556468 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b" exitCode=255 Feb 25 07:18:00 crc kubenswrapper[4749]: I0225 07:18:00.556523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b"} Feb 25 07:18:00 crc kubenswrapper[4749]: E0225 07:18:00.763337 4749 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.18976c1ca9ea11e2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1ca9ea11e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:28.759153122 +0000 UTC m=+2.120979152,LastTimestamp:2026-02-25 07:18:00.756695416 +0000 UTC m=+34.118521476,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:18:00 crc kubenswrapper[4749]: E0225 07:18:00.996151 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1cbeee2de0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cbeee2de0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 
07:17:29.111743968 +0000 UTC m=+2.473569998,LastTimestamp:2026-02-25 07:18:00.988357556 +0000 UTC m=+34.350183586,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:18:01 crc kubenswrapper[4749]: E0225 07:18:01.007058 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1cbfa42836\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1cbfa42836 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:29.12367007 +0000 UTC m=+2.485496100,LastTimestamp:2026-02-25 07:18:01.000669087 +0000 UTC m=+34.362495147,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.250163 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.561753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 
07:18:01.562205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009"} Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.562352 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.563497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.563554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.563580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.917307 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.918965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.919041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.919066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:01 crc kubenswrapper[4749]: I0225 07:18:01.919104 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 25 07:18:01 crc kubenswrapper[4749]: E0225 07:18:01.924547 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource 
\"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 25 07:18:01 crc kubenswrapper[4749]: E0225 07:18:01.924940 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 25 07:18:02 crc kubenswrapper[4749]: I0225 07:18:02.249306 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:02 crc kubenswrapper[4749]: I0225 07:18:02.564280 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:02 crc kubenswrapper[4749]: I0225 07:18:02.565248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:02 crc kubenswrapper[4749]: I0225 07:18:02.565271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:02 crc kubenswrapper[4749]: I0225 07:18:02.565281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:03 crc kubenswrapper[4749]: I0225 07:18:03.249551 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.083985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.084265 4749 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.085745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.085799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.085819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.086567 4749 scope.go:117] "RemoveContainer" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662" Feb 25 07:18:04 crc kubenswrapper[4749]: E0225 07:18:04.086869 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.250456 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.964972 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.965201 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.966811 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.966874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:04 crc kubenswrapper[4749]: I0225 07:18:04.966893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:05 crc kubenswrapper[4749]: I0225 07:18:05.251304 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:06 crc kubenswrapper[4749]: I0225 07:18:06.250722 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:07 crc kubenswrapper[4749]: I0225 07:18:07.251345 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:07 crc kubenswrapper[4749]: E0225 07:18:07.403315 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 25 07:18:07 crc kubenswrapper[4749]: W0225 07:18:07.768224 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 25 07:18:07 crc kubenswrapper[4749]: E0225 07:18:07.768315 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.215382 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.215681 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.217319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.217386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.217408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.250119 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.924756 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.926581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.927077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.927242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:08 crc kubenswrapper[4749]: I0225 07:18:08.927402 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 25 07:18:08 crc kubenswrapper[4749]: E0225 07:18:08.933135 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 25 07:18:08 crc kubenswrapper[4749]: E0225 07:18:08.933256 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 25 07:18:09 crc kubenswrapper[4749]: I0225 07:18:09.249880 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:10 crc kubenswrapper[4749]: I0225 07:18:10.250118 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:11 crc kubenswrapper[4749]: I0225 07:18:11.216347 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 25 07:18:11 crc kubenswrapper[4749]: I0225 07:18:11.216961 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:18:11 crc kubenswrapper[4749]: E0225 07:18:11.224354 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1f906ecfdb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 25 07:18:11 crc kubenswrapper[4749]: &Event{ObjectMeta:{kube-controller-manager-crc.18976c1f906ecfdb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 25 07:18:11 crc kubenswrapper[4749]: body:
Feb 25 07:18:11 crc kubenswrapper[4749]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216546779 +0000 UTC m=+14.578372839,LastTimestamp:2026-02-25 07:18:11.216923202 +0000 UTC m=+44.578749252,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 25 07:18:11 crc kubenswrapper[4749]: >
Feb 25 07:18:11 crc kubenswrapper[4749]: E0225 07:18:11.228740 4749 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18976c1f9070526d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18976c1f9070526d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:17:41.216645741 +0000 UTC m=+14.578471801,LastTimestamp:2026-02-25 07:18:11.217135937 +0000 UTC m=+44.578961997,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 25 07:18:11 crc kubenswrapper[4749]: I0225 07:18:11.249562 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:11 crc kubenswrapper[4749]: W0225 07:18:11.382855 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:11 crc kubenswrapper[4749]: E0225 07:18:11.382915 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 25 07:18:11 crc kubenswrapper[4749]: W0225 07:18:11.850435 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 25 07:18:11 crc kubenswrapper[4749]: E0225 07:18:11.850872 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 25 07:18:12 crc kubenswrapper[4749]: I0225 07:18:12.251581 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:13 crc kubenswrapper[4749]: W0225 07:18:13.029647 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Feb 25 07:18:13 crc kubenswrapper[4749]: E0225 07:18:13.029725 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 25 07:18:13 crc kubenswrapper[4749]: I0225 07:18:13.248829 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:14 crc kubenswrapper[4749]: I0225 07:18:14.250069 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.251463 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.933889 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.935484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.935527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.935544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:15 crc kubenswrapper[4749]: I0225 07:18:15.935578 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 25 07:18:15 crc kubenswrapper[4749]: E0225 07:18:15.941016 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 25 07:18:15 crc kubenswrapper[4749]: E0225 07:18:15.941100 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 25 07:18:16 crc kubenswrapper[4749]: I0225 07:18:16.251013 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:17 crc kubenswrapper[4749]: I0225 07:18:17.250126 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:17 crc kubenswrapper[4749]: E0225 07:18:17.403589 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.221330 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.221547 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.223215 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.223281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.223305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.227639 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.250065 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.609550 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.610898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.610994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:18 crc kubenswrapper[4749]: I0225 07:18:18.611020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.251573 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.321673 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.323237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.323294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.323317 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.324280 4749 scope.go:117] "RemoveContainer" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.614818 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.617909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75"}
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.618066 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.619562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.619606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:19 crc kubenswrapper[4749]: I0225 07:18:19.619617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.246540 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.272707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.272837 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.273741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.273773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.273785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.623251 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.624190 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.626299 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" exitCode=255
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.626346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75"}
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.626403 4749 scope.go:117] "RemoveContainer" containerID="a405051b1e6a63b7a034bbe2ea0d0656154bad229f1a1e63393cf02cbac5a662"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.626572 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.627434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.627462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.627475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:20 crc kubenswrapper[4749]: I0225 07:18:20.628152 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75"
Feb 25 07:18:20 crc kubenswrapper[4749]: E0225 07:18:20.628341 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 25 07:18:21 crc kubenswrapper[4749]: I0225 07:18:21.248403 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:21 crc kubenswrapper[4749]: I0225 07:18:21.630585 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.246777 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.941828 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.943358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.943444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.943488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:22 crc kubenswrapper[4749]: I0225 07:18:22.943535 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 25 07:18:22 crc kubenswrapper[4749]: E0225 07:18:22.948010 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 25 07:18:22 crc kubenswrapper[4749]: E0225 07:18:22.948776 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 25 07:18:23 crc kubenswrapper[4749]: I0225 07:18:23.249558 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.084729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.085050 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.086819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.086880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.086903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.087905 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75"
Feb 25 07:18:24 crc kubenswrapper[4749]: E0225 07:18:24.088284 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 25 07:18:24 crc kubenswrapper[4749]: I0225 07:18:24.251249 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:25 crc kubenswrapper[4749]: I0225 07:18:25.251024 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:26 crc kubenswrapper[4749]: I0225 07:18:26.250297 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:27 crc kubenswrapper[4749]: I0225 07:18:27.251187 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:27 crc kubenswrapper[4749]: E0225 07:18:27.404443 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.250192 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.739987 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.740272 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.742056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.742119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.742142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:28 crc kubenswrapper[4749]: I0225 07:18:28.743177 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75"
Feb 25 07:18:28 crc kubenswrapper[4749]: E0225 07:18:28.743579 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.248953 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.949143 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.950814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.950865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.950881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:29 crc kubenswrapper[4749]: I0225 07:18:29.950912 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 25 07:18:29 crc kubenswrapper[4749]: E0225 07:18:29.955955 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 25 07:18:29 crc kubenswrapper[4749]: E0225 07:18:29.956119 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 25 07:18:30 crc kubenswrapper[4749]: I0225 07:18:30.256156 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:31 crc kubenswrapper[4749]: I0225 07:18:31.250847 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:32 crc kubenswrapper[4749]: I0225 07:18:32.248670 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 25 07:18:32 crc kubenswrapper[4749]: I0225 07:18:32.655035 4749 csr.go:261] certificate signing request csr-44d9g is approved, waiting to be issued
Feb 25 07:18:32 crc kubenswrapper[4749]: I0225 07:18:32.662970 4749 csr.go:257] certificate signing request csr-44d9g is issued
Feb 25 07:18:32 crc kubenswrapper[4749]: I0225 07:18:32.731573 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 25 07:18:33 crc kubenswrapper[4749]: I0225 07:18:33.109399 4749 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 25 07:18:33 crc kubenswrapper[4749]: I0225 07:18:33.664760 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 16:20:24.329474998 +0000 UTC
Feb 25 07:18:33 crc kubenswrapper[4749]: I0225 07:18:33.664798 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6801h1m50.664680028s for next certificate rotation
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.854125 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.956293 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.957725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.957801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.957832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.957989 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.965660 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.965916 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 25 07:18:36 crc kubenswrapper[4749]: E0225 07:18:36.965940 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.971329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.971403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.971455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.971475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.971488 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:36Z","lastTransitionTime":"2026-02-25T07:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:36 crc kubenswrapper[4749]: E0225 07:18:36.988181 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.991951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.992060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.992125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.992193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:36 crc kubenswrapper[4749]: I0225 07:18:36.992257 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:36Z","lastTransitionTime":"2026-02-25T07:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.003062 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.006415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.006531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.006633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.006705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.006758 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:37Z","lastTransitionTime":"2026-02-25T07:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.016571 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.020786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.020860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.020874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.020902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.020915 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:37Z","lastTransitionTime":"2026-02-25T07:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.029194 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.029303 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.029328 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.129994 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.230623 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.322216 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.323464 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.323494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:37 crc kubenswrapper[4749]: I0225 07:18:37.323503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.331327 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.405491 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: 
E0225 07:18:37.432209 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.532578 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.633767 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.734590 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.835635 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:37 crc kubenswrapper[4749]: E0225 07:18:37.936087 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.037096 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.137582 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.238018 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: I0225 07:18:38.309504 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.339050 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.440010 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.540466 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.641415 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.741886 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.842312 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:38 crc kubenswrapper[4749]: E0225 07:18:38.943000 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.043132 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.143768 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.244664 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.344767 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.445837 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.546897 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.647774 4749 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.748701 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.849745 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:39 crc kubenswrapper[4749]: E0225 07:18:39.950842 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.051662 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.152012 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.252987 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.353884 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.454781 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.555400 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.656555 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.757069 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.858262 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:40 crc kubenswrapper[4749]: E0225 07:18:40.959021 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.059996 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.160733 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.261415 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.362059 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.462551 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.563779 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.664316 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.764807 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.865746 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:41 crc kubenswrapper[4749]: E0225 07:18:41.966574 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc 
kubenswrapper[4749]: E0225 07:18:42.067648 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.167889 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.268673 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.369495 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.469644 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.570715 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.671796 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.772840 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.873912 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:42 crc kubenswrapper[4749]: E0225 07:18:42.974181 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.074734 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.175072 4749 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.275361 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: I0225 07:18:43.322990 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 25 07:18:43 crc kubenswrapper[4749]: I0225 07:18:43.324762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:43 crc kubenswrapper[4749]: I0225 07:18:43.324893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:43 crc kubenswrapper[4749]: I0225 07:18:43.324923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:43 crc kubenswrapper[4749]: I0225 07:18:43.326230 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.326649 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.376509 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.476881 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.577414 4749 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.678615 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.778770 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.879161 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:43 crc kubenswrapper[4749]: E0225 07:18:43.979508 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.079792 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.180741 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.281677 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.382925 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.483795 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.584632 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.684768 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.785104 4749 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.885440 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:44 crc kubenswrapper[4749]: E0225 07:18:44.986334 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.087032 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.187293 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.287375 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.388006 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.489074 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.589346 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.690073 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.790221 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc kubenswrapper[4749]: E0225 07:18:45.891112 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:45 crc 
kubenswrapper[4749]: E0225 07:18:45.992529 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.093288 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.194479 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.295078 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.396211 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.497074 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.597226 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.698318 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.798858 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:46 crc kubenswrapper[4749]: E0225 07:18:46.899888 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.000926 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.061201 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.067299 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.067707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.067906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.068064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.068197 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:47Z","lastTransitionTime":"2026-02-25T07:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.087792 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.093227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.093265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.093278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.093294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.093312 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:47Z","lastTransitionTime":"2026-02-25T07:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.107030 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.111137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.111169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.111179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.111192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.111201 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:47Z","lastTransitionTime":"2026-02-25T07:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.125337 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.130387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.130443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.130462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.130488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:47 crc kubenswrapper[4749]: I0225 07:18:47.130508 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:47Z","lastTransitionTime":"2026-02-25T07:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.145297 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.145339 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.246450 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.347048 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.405848 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.448204 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.548448 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.649648 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.750149 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: E0225 07:18:47.850285 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:47 crc kubenswrapper[4749]: 
E0225 07:18:47.950887 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.051154 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.151460 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.252561 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.352994 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.453670 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.554566 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.655162 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.756165 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.856294 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:48 crc kubenswrapper[4749]: E0225 07:18:48.957096 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.057450 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.157973 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.258853 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.360041 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.461006 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.561260 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.661751 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.762675 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.863109 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:49 crc kubenswrapper[4749]: E0225 07:18:49.963995 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:50 crc kubenswrapper[4749]: E0225 07:18:50.064865 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:50 crc kubenswrapper[4749]: E0225 07:18:50.165557 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:50 crc kubenswrapper[4749]: E0225 07:18:50.265666 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:50 crc kubenswrapper[4749]: E0225 07:18:50.366728 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.437816 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.469300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.469369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.469399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.469438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.469457 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.572655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.572700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.572712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.572751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.572765 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.675165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.675252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.675273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.675296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.675316 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.778625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.778694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.778716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.778744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.778768 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.881991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.882053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.882070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.882137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.882160 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.984720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.984806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.984892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.984927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:50 crc kubenswrapper[4749]: I0225 07:18:50.984951 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:50Z","lastTransitionTime":"2026-02-25T07:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.089130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.089238 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.089257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.089282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.089302 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.192068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.192183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.192214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.192247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.192274 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.287561 4749 apiserver.go:52] "Watching apiserver"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.295768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.295827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.295847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.295875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.295893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.298380 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.299158 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qkp9r","openshift-multus/multus-bkmjf","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-node-r9pzm","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s","openshift-machine-config-operator/machine-config-daemon-ljd89","openshift-multus/multus-additional-cni-plugins-tmpqc","openshift-multus/network-metrics-daemon-h66ds","openshift-dns/node-resolver-89w9z","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.299764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.299967 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.300081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.300552 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.301847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.302213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.302806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.303746 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.304108 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.306986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.307075 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.306990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.307496 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.308391 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.308531 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.308644 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.308687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ljd89"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.308767 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311388 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.311474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qkp9r"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311653 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmpqc"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bkmjf"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311823 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-89w9z"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311862 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.312011 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.311862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.313957 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.314256 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.316094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.316343 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.318425 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.318685 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.318857 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319051 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319194 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319261 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319871 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.319928 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320095 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320208 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320368 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320383 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320444 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320660 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320572 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.320721 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.321461 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.337493 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.351523 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.353299 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355826 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355859 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.355987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356170 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356531 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.356982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.357038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.357205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.357328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.357660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.358034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.358463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.358340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.358901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.359194 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.369681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.369705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.369856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.369914 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 
25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.369968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") 
pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.370955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371344 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371560 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371591 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371811 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.371917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372051 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372278 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372773 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372805 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372840 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 
07:18:51.373164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373431 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 
07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373759 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373796 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373859 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 
07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374218 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374525 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374681 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375169 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.375245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372527 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.372930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.373359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.374690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.375275 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:18:51.875245857 +0000 UTC m=+85.237071907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.391995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392297 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392384 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377127 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377213 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.377982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.378373 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.378864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379380 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.379758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.380817 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.381127 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.381157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.381525 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.381562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.382388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.382458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.382560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.382652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.382903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.383139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.383057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.383297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.383959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.384200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.384218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.384332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.385491 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.386894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387062 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.387901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.388332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.388531 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.389248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.389671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.389696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.390437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.390487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.390683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.392753 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.393135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.393330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.393356 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.376754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.395952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.396047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.396073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.396218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.396332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.396834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.397115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.398126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.398498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.398682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.398794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.398978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399912 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.399965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400140 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400190 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400247 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400397 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400424 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400286 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400641 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400897 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.400989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 25 07:18:51 
crc kubenswrapper[4749]: I0225 07:18:51.401188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401233 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.401959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402051 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402501 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" 
(UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.402998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403284 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1183771e-2d52-421f-8c26-0aaff531934a-rootfs\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1183771e-2d52-421f-8c26-0aaff531934a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib\") 
pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-hostroot\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4702e42e-e7d8-4126-b968-dabf40d0798f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.403984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-system-cni-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxtjv\" (UniqueName: \"kubernetes.io/projected/3e94f971-4065-492a-822d-39734b6edf77-kube-api-access-hxtjv\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404125 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rltj8\" (UniqueName: \"kubernetes.io/projected/21c23d4e-91a8-4374-84dc-7bdc7450661d-kube-api-access-rltj8\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404234 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404309 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e94f971-4065-492a-822d-39734b6edf77-host\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.405353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.405504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.406090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-kubelet\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.406394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-etc-kubernetes\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.406527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.407263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.407378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw4qz\" (UniqueName: \"kubernetes.io/projected/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-kube-api-access-hw4qz\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.407551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvtjq\" (UniqueName: \"kubernetes.io/projected/1183771e-2d52-421f-8c26-0aaff531934a-kube-api-access-dvtjq\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.404710 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.407823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.407685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 
07:18:51.408405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408451 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.408952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.409121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.409548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.409827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klvlz\" (UniqueName: \"kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410234 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-os-release\") 
pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183771e-2d52-421f-8c26-0aaff531934a-proxy-tls\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410315 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410463 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-os-release\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-multus-certs\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qfr\" (UniqueName: \"kubernetes.io/projected/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-kube-api-access-m4qfr\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410643 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-netns\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410683 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-conf-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-system-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e94f971-4065-492a-822d-39734b6edf77-serviceca\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410886 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b170d49d-ba02-4991-8a52-79b7114d6a67-hosts-file\") pod \"node-resolver-89w9z\" (UID: \"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.410962 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.411004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.411027 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.411258 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.411348 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:51.911319529 +0000 UTC m=+85.273145589 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.411370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.411809 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.411042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-cnibin\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.412625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.412712 4749 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.412914 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:51.912882519 +0000 UTC m=+85.274708559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.417986 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.418135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.418156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.418281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.418728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.418759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419354 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420382 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-socket-dir-parent\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420490 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-daemon-config\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.419985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cnibin\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.420913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x9w\" (UniqueName: \"kubernetes.io/projected/b170d49d-ba02-4991-8a52-79b7114d6a67-kube-api-access-w8x9w\") pod \"node-resolver-89w9z\" (UID: \"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: 
\"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-k8s-cni-cncf-io\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-multus\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421673 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.421955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-bin\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422306 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-cni-binary-copy\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lvt2\" (UniqueName: \"kubernetes.io/projected/4702e42e-e7d8-4126-b968-dabf40d0798f-kube-api-access-2lvt2\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.422340 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.422363 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.422378 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422627 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.422661 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:51.922639348 +0000 UTC m=+85.284465558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422690 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422707 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422718 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422728 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422760 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node 
\"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422771 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422779 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422788 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422797 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422806 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422814 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422839 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422848 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.422858 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423037 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423049 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423059 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423068 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423109 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423118 4749 reconciler_common.go:293] "Volume detached for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423128 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423137 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423146 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423173 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423182 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423190 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423199 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node 
\"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423208 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423217 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423225 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423251 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423260 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423269 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423279 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423288 4749 reconciler_common.go:293] "Volume detached for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423297 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423305 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423340 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423348 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423357 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423365 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423374 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" 
DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423384 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423392 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423427 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423437 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423447 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423462 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423471 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423497 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423506 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423518 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423527 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423569 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423579 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath 
\"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423588 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423631 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423643 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423655 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423665 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423697 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423708 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 
07:18:51.423719 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423730 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423740 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423748 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423774 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423783 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423794 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423806 4749 reconciler_common.go:293] "Volume detached 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423817 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423828 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423856 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423868 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423877 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423888 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423900 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423911 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423942 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423951 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423959 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423967 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423976 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.423986 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath 
\"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424015 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424027 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424038 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424049 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424059 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424070 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424101 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424113 4749 
reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424124 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424136 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424148 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424178 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424192 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424204 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424216 4749 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424227 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424258 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424269 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424280 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424291 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424303 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424338 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424350 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424362 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424373 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424383 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424410 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424419 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424427 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424435 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424444 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424453 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424461 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.424470 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425771 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425781 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425790 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425799 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425810 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425867 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" 
DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425880 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425892 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425902 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425913 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.425924 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426153 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426166 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426175 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426186 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426195 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426204 4749 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426229 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426241 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426249 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426258 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426267 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426276 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426286 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426392 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426402 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426411 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426420 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426428 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426437 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426445 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc 
kubenswrapper[4749]: I0225 07:18:51.426885 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426895 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426921 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426929 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426962 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426971 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426981 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426990 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.426999 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427008 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427032 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427041 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427049 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427057 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427066 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" 
DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427074 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427083 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427107 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.427117 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.427414 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.427427 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.427438 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc 
kubenswrapper[4749]: E0225 07:18:51.427473 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:51.927460581 +0000 UTC m=+85.289286601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.428746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.428757 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.428787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.428990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.430984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.431035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.431682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.431690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.433342 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.433714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.433778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.433920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.433981 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.434072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.434056 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.434588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.435332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.438149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.439173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.440156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.440714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.445276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.448524 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.453439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.455548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.455936 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.462749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.464999 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.481755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.491937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.503042 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.510033 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.511544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.511659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.511744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.511832 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.511927 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.519503 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-netns\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-conf-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qfr\" (UniqueName: 
\"kubernetes.io/projected/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-kube-api-access-m4qfr\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-system-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528658 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e94f971-4065-492a-822d-39734b6edf77-serviceca\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/b170d49d-ba02-4991-8a52-79b7114d6a67-hosts-file\") pod \"node-resolver-89w9z\" (UID: \"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-daemon-config\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cnibin\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x9w\" (UniqueName: \"kubernetes.io/projected/b170d49d-ba02-4991-8a52-79b7114d6a67-kube-api-access-w8x9w\") pod \"node-resolver-89w9z\" (UID: \"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-cnibin\") pod \"multus-bkmjf\" 
(UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-socket-dir-parent\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.528963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: 
\"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529135 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-k8s-cni-cncf-io\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529200 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-multus\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: 
\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-cni-binary-copy\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 
07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-bin\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lvt2\" (UniqueName: \"kubernetes.io/projected/4702e42e-e7d8-4126-b968-dabf40d0798f-kube-api-access-2lvt2\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc 
kubenswrapper[4749]: I0225 07:18:51.529639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-hostroot\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1183771e-2d52-421f-8c26-0aaff531934a-rootfs\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1183771e-2d52-421f-8c26-0aaff531934a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: 
I0225 07:18:51.529820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4702e42e-e7d8-4126-b968-dabf40d0798f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-system-cni-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxtjv\" (UniqueName: \"kubernetes.io/projected/3e94f971-4065-492a-822d-39734b6edf77-kube-api-access-hxtjv\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529977 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e94f971-4065-492a-822d-39734b6edf77-host\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" 
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.529998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-kubelet\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rltj8\" (UniqueName: \"kubernetes.io/projected/21c23d4e-91a8-4374-84dc-7bdc7450661d-kube-api-access-rltj8\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-etc-kubernetes\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw4qz\" (UniqueName: \"kubernetes.io/projected/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-kube-api-access-hw4qz\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530235 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530791 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b170d49d-ba02-4991-8a52-79b7114d6a67-hosts-file\") pod \"node-resolver-89w9z\" (UID: \"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvtjq\" (UniqueName: \"kubernetes.io/projected/1183771e-2d52-421f-8c26-0aaff531934a-kube-api-access-dvtjq\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.530983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klvlz\" (UniqueName: \"kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-os-release\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-multus-certs\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-os-release\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183771e-2d52-421f-8c26-0aaff531934a-proxy-tls\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4702e42e-e7d8-4126-b968-dabf40d0798f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-kubelet\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-system-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531670 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.531796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-hostroot\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-system-cni-dir\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-multus\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-conf-dir\") pod \"multus-bkmjf\" (UID: 
\"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-netns\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-socket-dir-parent\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cnibin\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc 
kubenswrapper[4749]: I0225 07:18:51.532394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-cni-dir\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.532611 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.532671 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. 
No retries permitted until 2026-02-25 07:18:52.032654779 +0000 UTC m=+85.394480809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532830 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-os-release\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-multus-certs\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-run-k8s-cni-cncf-io\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-os-release\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.532970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-host-var-lib-cni-bin\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1183771e-2d52-421f-8c26-0aaff531934a-rootfs\") pod 
\"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-etc-kubernetes\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e94f971-4065-492a-822d-39734b6edf77-host\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533329 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533350 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533364 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533378 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533397 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533413 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533429 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533444 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533456 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533468 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 25 
07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533479 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533490 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533502 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533514 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533525 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533536 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533549 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533560 4749 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533572 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533584 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533617 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533630 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533641 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533652 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533662 4749 reconciler_common.go:293] "Volume 
detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/21c23d4e-91a8-4374-84dc-7bdc7450661d-cnibin\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.533895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1183771e-2d52-421f-8c26-0aaff531934a-mcd-auth-proxy-config\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.534502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.535845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.535998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-multus-daemon-config\") pod \"multus-bkmjf\" (UID: 
\"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.536470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e94f971-4065-492a-822d-39734b6edf77-serviceca\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.536561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.536672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/21c23d4e-91a8-4374-84dc-7bdc7450661d-cni-binary-copy\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.536819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.543241 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.544420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4702e42e-e7d8-4126-b968-dabf40d0798f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.545045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.548106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1183771e-2d52-421f-8c26-0aaff531934a-proxy-tls\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.552427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qfr\" (UniqueName: 
\"kubernetes.io/projected/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-kube-api-access-m4qfr\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.555007 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.557404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvtjq\" (UniqueName: \"kubernetes.io/projected/1183771e-2d52-421f-8c26-0aaff531934a-kube-api-access-dvtjq\") pod \"machine-config-daemon-ljd89\" (UID: \"1183771e-2d52-421f-8c26-0aaff531934a\") " pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.558679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw4qz\" (UniqueName: \"kubernetes.io/projected/4a665cc4-2925-4a4d-bd03-3a05d3dee6da-kube-api-access-hw4qz\") pod \"multus-additional-cni-plugins-tmpqc\" (UID: \"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\") " pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.560559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x9w\" (UniqueName: \"kubernetes.io/projected/b170d49d-ba02-4991-8a52-79b7114d6a67-kube-api-access-w8x9w\") pod \"node-resolver-89w9z\" (UID: 
\"b170d49d-ba02-4991-8a52-79b7114d6a67\") " pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.562111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxtjv\" (UniqueName: \"kubernetes.io/projected/3e94f971-4065-492a-822d-39734b6edf77-kube-api-access-hxtjv\") pod \"node-ca-qkp9r\" (UID: \"3e94f971-4065-492a-822d-39734b6edf77\") " pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.562752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klvlz\" (UniqueName: \"kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz\") pod \"ovnkube-node-r9pzm\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.563375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lvt2\" (UniqueName: \"kubernetes.io/projected/4702e42e-e7d8-4126-b968-dabf40d0798f-kube-api-access-2lvt2\") pod \"ovnkube-control-plane-749d76644c-mw54s\" (UID: \"4702e42e-e7d8-4126-b968-dabf40d0798f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.563552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rltj8\" (UniqueName: \"kubernetes.io/projected/21c23d4e-91a8-4374-84dc-7bdc7450661d-kube-api-access-rltj8\") pod \"multus-bkmjf\" (UID: \"21c23d4e-91a8-4374-84dc-7bdc7450661d\") " pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.614486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.614823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.614906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.615009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.615097 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.629079 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.643115 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.657514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.660214 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-26c206c2ee4eb86fb3e497414769e7e354913d206cb7b9f805ec6a31c0f35492 WatchSource:0}: Error finding container 26c206c2ee4eb86fb3e497414769e7e354913d206cb7b9f805ec6a31c0f35492: Status 404 returned error can't find the container with id 26c206c2ee4eb86fb3e497414769e7e354913d206cb7b9f805ec6a31c0f35492 Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.670700 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.682453 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-593a170a64043d911bf28401e9c7f060a08dc23101bdef2293d026ef2be89cbf WatchSource:0}: Error finding container 593a170a64043d911bf28401e9c7f060a08dc23101bdef2293d026ef2be89cbf: Status 404 returned error can't find the container with id 593a170a64043d911bf28401e9c7f060a08dc23101bdef2293d026ef2be89cbf Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.696206 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qkp9r" Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.710271 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1183771e_2d52_421f_8c26_0aaff531934a.slice/crio-d23a8e4388dcf8e729efd73f44484f056271baae3b313fef7f5797273f8903db WatchSource:0}: Error finding container d23a8e4388dcf8e729efd73f44484f056271baae3b313fef7f5797273f8903db: Status 404 returned error can't find the container with id d23a8e4388dcf8e729efd73f44484f056271baae3b313fef7f5797273f8903db Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.712544 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.718679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.718719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.718731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.718747 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.718759 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.745173 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e94f971_4065_492a_822d_39734b6edf77.slice/crio-0adc758f3b45ae57256448e9b71fc1b473389e3d6bcee0b9e74daefcdf0c5556 WatchSource:0}: Error finding container 0adc758f3b45ae57256448e9b71fc1b473389e3d6bcee0b9e74daefcdf0c5556: Status 404 returned error can't find the container with id 0adc758f3b45ae57256448e9b71fc1b473389e3d6bcee0b9e74daefcdf0c5556 Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.754796 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bkmjf" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.760715 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.767442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-89w9z" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.773074 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.779134 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a665cc4_2925_4a4d_bd03_3a05d3dee6da.slice/crio-6f339d33bdcfab3e95b61596b3db645b4496ea0d63879f67a1016b311b053961 WatchSource:0}: Error finding container 6f339d33bdcfab3e95b61596b3db645b4496ea0d63879f67a1016b311b053961: Status 404 returned error can't find the container with id 6f339d33bdcfab3e95b61596b3db645b4496ea0d63879f67a1016b311b053961 Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.809764 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c23d4e_91a8_4374_84dc_7bdc7450661d.slice/crio-9a7de54c30db9e79fd3bc5f1edcef46797ba9fae7279dab0458d950cadf8c0d6 WatchSource:0}: Error finding container 9a7de54c30db9e79fd3bc5f1edcef46797ba9fae7279dab0458d950cadf8c0d6: Status 404 returned error can't find the container with id 9a7de54c30db9e79fd3bc5f1edcef46797ba9fae7279dab0458d950cadf8c0d6 Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.822018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.822054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.822065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.822081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.822092 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.838203 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4702e42e_e7d8_4126_b968_dabf40d0798f.slice/crio-be1469ee36fedf31c4b378e5027a7a472639f7194dc4136840d7b83fa6d2912e WatchSource:0}: Error finding container be1469ee36fedf31c4b378e5027a7a472639f7194dc4136840d7b83fa6d2912e: Status 404 returned error can't find the container with id be1469ee36fedf31c4b378e5027a7a472639f7194dc4136840d7b83fa6d2912e Feb 25 07:18:51 crc kubenswrapper[4749]: W0225 07:18:51.840277 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb170d49d_ba02_4991_8a52_79b7114d6a67.slice/crio-05ff4e619ef9f16d168d3abdca234849c2e39a2ef2c327a4445a0ea0a7720620 WatchSource:0}: Error finding container 05ff4e619ef9f16d168d3abdca234849c2e39a2ef2c327a4445a0ea0a7720620: Status 404 returned error can't find the container with id 05ff4e619ef9f16d168d3abdca234849c2e39a2ef2c327a4445a0ea0a7720620 Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.902776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"26c206c2ee4eb86fb3e497414769e7e354913d206cb7b9f805ec6a31c0f35492"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.904126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" 
event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerStarted","Data":"9a7de54c30db9e79fd3bc5f1edcef46797ba9fae7279dab0458d950cadf8c0d6"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.915078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"929bbb1ef458e727af1ffabf8eceab627e6721fbb331a59810466211fc0cfde8"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.916787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" event={"ID":"4702e42e-e7d8-4126-b968-dabf40d0798f","Type":"ContainerStarted","Data":"be1469ee36fedf31c4b378e5027a7a472639f7194dc4136840d7b83fa6d2912e"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.917493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"26dd9b1ef52806d951dea6a445f2af47160b2cb5cca7faf401a2e09c36176d44"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.918391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"593a170a64043d911bf28401e9c7f060a08dc23101bdef2293d026ef2be89cbf"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.919440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qkp9r" event={"ID":"3e94f971-4065-492a-822d-39734b6edf77","Type":"ContainerStarted","Data":"0adc758f3b45ae57256448e9b71fc1b473389e3d6bcee0b9e74daefcdf0c5556"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.920348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"d23a8e4388dcf8e729efd73f44484f056271baae3b313fef7f5797273f8903db"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.921433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-89w9z" event={"ID":"b170d49d-ba02-4991-8a52-79b7114d6a67","Type":"ContainerStarted","Data":"05ff4e619ef9f16d168d3abdca234849c2e39a2ef2c327a4445a0ea0a7720620"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.922441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerStarted","Data":"6f339d33bdcfab3e95b61596b3db645b4496ea0d63879f67a1016b311b053961"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.923501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.923549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.923562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.923580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.923621 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:51Z","lastTransitionTime":"2026-02-25T07:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.937536 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.937655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937758 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:18:52.937720986 +0000 UTC m=+86.299547056 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937765 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.937801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.937851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937859 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:52.937845219 +0000 UTC m=+86.299671499 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: I0225 07:18:51.937904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937929 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937946 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.937958 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938010 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-25 07:18:52.937994733 +0000 UTC m=+86.299820763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938044 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938068 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:52.938060435 +0000 UTC m=+86.299886465 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938140 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938166 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938183 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:51 crc kubenswrapper[4749]: E0225 07:18:51.938232 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:52.938217129 +0000 UTC m=+86.300043339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.026580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.026637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.026647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.026664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.026676 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.038641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.038820 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.038941 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:53.0389131 +0000 UTC m=+86.400739120 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.135011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.135044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.135055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.135072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.135083 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.236966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.237013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.237025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.237038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.237046 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.321895 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.322470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.340812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.340849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.340861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.340878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.340890 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.443303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.443355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.443368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.443385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.443397 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.545176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.545214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.545222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.545235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.545244 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.647057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.647119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.647136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.647159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.647177 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.750405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.750486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.750511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.750545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.750570 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.854474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.854531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.854548 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.854569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.854585 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.928263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-89w9z" event={"ID":"b170d49d-ba02-4991-8a52-79b7114d6a67","Type":"ContainerStarted","Data":"863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.931078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" event={"ID":"4702e42e-e7d8-4126-b968-dabf40d0798f","Type":"ContainerStarted","Data":"1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.931149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" event={"ID":"4702e42e-e7d8-4126-b968-dabf40d0798f","Type":"ContainerStarted","Data":"c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.934584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.934655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.936445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" 
event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerStarted","Data":"c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.938415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qkp9r" event={"ID":"3e94f971-4065-492a-822d-39734b6edf77","Type":"ContainerStarted","Data":"320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.940260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.941990 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2" exitCode=0 Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.942075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.944783 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" exitCode=0 Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.944855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} Feb 25 07:18:52 crc kubenswrapper[4749]: 
I0225 07:18:52.947055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947616 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:18:54.947587794 +0000 UTC m=+88.309413814 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947874 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947888 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.947888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947898 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947998 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.947935 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948027 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948018 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:54.947995975 +0000 UTC m=+88.309822035 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948061 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948084 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948064 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:54.948056827 +0000 UTC m=+88.309882847 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948180 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:54.948155659 +0000 UTC m=+88.309981829 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:52 crc kubenswrapper[4749]: E0225 07:18:52.948213 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:54.9481953 +0000 UTC m=+88.310021540 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.954840 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.957470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.957542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.957560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.957585 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.957634 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:52Z","lastTransitionTime":"2026-02-25T07:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.974099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:52 crc kubenswrapper[4749]: I0225 07:18:52.989720 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.008930 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.025101 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.039011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.049662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:53 crc kubenswrapper[4749]: E0225 07:18:53.050566 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:53 crc kubenswrapper[4749]: E0225 07:18:53.051033 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:55.051005156 +0000 UTC m=+88.412831216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.054149 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.061890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.061940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.061955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.061974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.061986 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.065299 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.082498 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.095087 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.109988 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.131445 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc 
kubenswrapper[4749]: I0225 07:18:53.147665 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.163561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.163604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.163616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.163631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.163644 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.166074 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.175946 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18
:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.188883 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.203904 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.214759 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.228327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.243747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.255702 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc 
kubenswrapper[4749]: I0225 07:18:53.265230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.265258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.265266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.265280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.265289 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.271344 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z 
is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.281441 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.290308 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.301420 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.317286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.321429 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:53 crc kubenswrapper[4749]: E0225 07:18:53.321524 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.321625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.321650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:53 crc kubenswrapper[4749]: E0225 07:18:53.321774 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:18:53 crc kubenswrapper[4749]: E0225 07:18:53.321826 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.328725 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.329828 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.330998 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.333846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.335236 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.337452 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.338875 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.339701 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.341738 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.343197 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.345417 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.346720 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.359861 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.363030 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.363835 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.364507 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.365753 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.366879 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.367553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.367602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.367616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.367633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.367644 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.368164 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.368801 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.369558 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.371062 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.371875 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.372542 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.373605 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.374345 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.375305 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.376042 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.379452 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.380196 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.381039 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.382105 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.382725 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.382865 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.389261 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.390110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.390612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.394417 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.395376 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.396040 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.396713 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.397404 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.397914 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.398491 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.399143 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.400919 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.401516 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.402455 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.404232 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.405534 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.406121 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.406652 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.407643 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.408280 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.409437 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.409924 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.471144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.471174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc 
kubenswrapper[4749]: I0225 07:18:53.471184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.471199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.471209 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.573181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.573471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.573479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.573492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.573508 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.676309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.676349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.676361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.676379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.676391 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.779723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.779767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.779785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.779806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.779825 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.882078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.882113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.882123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.882137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.882148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.956582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.956675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.956694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.956711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.956728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.960185 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0" exitCode=0 Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.961629 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.984844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.985223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.985246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.985281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.985299 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:53Z","lastTransitionTime":"2026-02-25T07:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:53 crc kubenswrapper[4749]: I0225 07:18:53.990099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.006733 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.033248 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.047364 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.065807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.079449 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc 
kubenswrapper[4749]: I0225 07:18:54.088000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.088035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.088046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.088062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.088070 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.100271 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z 
is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.112740 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.122755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.142294 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.158895 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.175135 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.191860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.192506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.193258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc 
kubenswrapper[4749]: I0225 07:18:54.193372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.193512 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.193618 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.204798 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.296092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.296133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.296146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.296166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.296179 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.322031 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.322159 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.358633 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.358709 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.359253 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.398819 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.398854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.398869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.398886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.398898 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.501248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.501295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.501306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.501330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.501342 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.603928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.603974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.603987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.604006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.604018 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.706332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.706368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.706381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.706395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.706404 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.809097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.809137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.809148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.809165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.809177 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.912333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.912403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.912425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.912453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.912477 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:54Z","lastTransitionTime":"2026-02-25T07:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.969231 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68" exitCode=0 Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.969372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.969664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.969827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.969914 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.969986 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 07:18:58.969953459 +0000 UTC m=+92.331779479 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.970020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.970058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970088 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:58.970052651 +0000 UTC m=+92.331878821 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.970148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970165 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970266 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970220 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970291 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970211 4749 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970358 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:58.970330508 +0000 UTC m=+92.332156648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970370 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970389 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970394 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:58.970376319 +0000 UTC m=+92.332202379 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.970444 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:58.970423 +0000 UTC m=+92.332249170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.975690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.977721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7"} Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.978272 4749 scope.go:117] "RemoveContainer" 
containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" Feb 25 07:18:54 crc kubenswrapper[4749]: E0225 07:18:54.978454 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.983064 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:54 crc kubenswrapper[4749]: I0225 07:18:54.997797 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:54Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.012889 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.014790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.014856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.014881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 
07:18:55.014910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.014937 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.025705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.043999 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.057551 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc 
kubenswrapper[4749]: I0225 07:18:55.071570 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc 
kubenswrapper[4749]: I0225 07:18:55.072339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:55 crc kubenswrapper[4749]: E0225 07:18:55.072625 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:55 crc kubenswrapper[4749]: E0225 07:18:55.072732 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:18:59.072701583 +0000 UTC m=+92.434527633 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.103021 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.119408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.119462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.119479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.119500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.119514 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.120655 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.138143 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.155341 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.166023 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.183144 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.201668 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.216785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.224335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.224391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.224405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.224424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.224441 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.233244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.246146 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.257126 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.266080 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.278903 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.294698 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.316795 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.321310 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.321347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:55 crc kubenswrapper[4749]: E0225 07:18:55.321469 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.321500 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:55 crc kubenswrapper[4749]: E0225 07:18:55.321610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:18:55 crc kubenswrapper[4749]: E0225 07:18:55.321695 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.326369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.326412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.326428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.326447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.326462 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.331486 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc 
kubenswrapper[4749]: I0225 07:18:55.342846 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc 
kubenswrapper[4749]: I0225 07:18:55.353689 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.364226 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.375486 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.390212 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.408411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.425253 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:55Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.428823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.428871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.428885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.428903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.428913 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.532205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.532278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.532306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.532372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.532396 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.734027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.734070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.734084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.734101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.734112 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.836295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.836329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.836337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.836349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.836358 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.939349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.939389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.939400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.939415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.939425 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:55Z","lastTransitionTime":"2026-02-25T07:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.984242 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c" exitCode=0 Feb 25 07:18:55 crc kubenswrapper[4749]: I0225 07:18:55.984315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.009419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.025281 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.035379 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.041996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.042035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.042046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.042059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.042069 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.056666 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.074255 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.099553 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.110722 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.122905 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.137641 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.144110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.144145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.144154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.144168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.144177 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.152290 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.168120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc 
kubenswrapper[4749]: I0225 07:18:56.184197 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc 
kubenswrapper[4749]: I0225 07:18:56.198178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.215838 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.226778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:56Z is after 2025-08-24T17:21:41Z" Feb 25 
07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.247059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.247093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.247105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.247122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.247134 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.321744 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:56 crc kubenswrapper[4749]: E0225 07:18:56.321911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.349901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.349936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.349944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.349958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.349968 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.454000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.454067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.454085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.454112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.454130 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.556010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.556072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.556089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.556113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.556130 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.658918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.658958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.658969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.658985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.658997 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.762063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.762117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.762133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.762153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.762168 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.864280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.864541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.864838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.865107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.865342 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.969545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.969910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.970068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.970233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.970378 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:56Z","lastTransitionTime":"2026-02-25T07:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.994052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:18:56 crc kubenswrapper[4749]: I0225 07:18:56.999144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerStarted","Data":"dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.064799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc 
kubenswrapper[4749]: I0225 07:18:57.073297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.073348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.073365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.073390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.073419 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.090316 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.111082 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.125767 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.141525 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.156915 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.176043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.176406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.176524 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.176641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.176745 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.177300 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a9
22aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.192823 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.207823 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.222839 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.236465 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.254836 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.269417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc 
kubenswrapper[4749]: I0225 07:18:57.279417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.279437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.279445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.279459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.279467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.288389 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z 
is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.317686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.321978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.322150 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.322204 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.322350 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.322547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.322943 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.345163 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.355403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.355461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.355478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 
07:18:57.355502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.355519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.370555 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.380950 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f
4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.383518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.383547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.383557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.383573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.383584 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.395901 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.399882 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.403974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.404013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.404024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.404040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.404050 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.418069 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc 
kubenswrapper[4749]: E0225 07:18:57.423558 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.428679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.428717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.428731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.428748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.428762 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.448326 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.452962 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.454823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.454895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.454909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.454931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.454946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.466817 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.477894 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: E0225 07:18:57.478188 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482049 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.482235 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.503559 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.522738 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.543034 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.554316 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593237 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.593232 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.608363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.626841 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.649294 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4
qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.695539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.695624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.695643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.695668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.695687 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.799068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.799111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.799119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.799131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.799149 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.902526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.902578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.902621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.902645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:57 crc kubenswrapper[4749]: I0225 07:18:57.902662 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:57Z","lastTransitionTime":"2026-02-25T07:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.005630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.005678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.006121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.006249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.006280 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.008008 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f" exitCode=0 Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.008064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.027570 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4
ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.044315 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.066520 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.084563 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.099966 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.109476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 
07:18:58.109537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.109681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.109710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.109729 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.114388 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.128827 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.158154 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.179204 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 
07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.197416 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215109 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.215905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.228275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.245048 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.267259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.289335 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:58Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.321411 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.321638 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.323747 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.323808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.323827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.323852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.323870 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.426728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.426983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.427063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.427124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.427192 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.530456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.530495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.530505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.530520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.530530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.632294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.632330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.632338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.632352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.632360 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.740351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.740689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.740699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.740712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.740722 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.843097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.843166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.843189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.843219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.843241 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.946802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.946863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.946881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.946904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.946921 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:58Z","lastTransitionTime":"2026-02-25T07:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.986470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.986777 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 07:19:06.986738433 +0000 UTC m=+100.348564493 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.986869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.986919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987106 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987123 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987132 4749 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987152 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987163 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987172 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987234 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:06.987217795 +0000 UTC m=+100.349043855 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987263 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:06.987249985 +0000 UTC m=+100.349076045 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987308 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987431 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:06.987401579 +0000 UTC m=+100.349227639 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.987570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:58 crc kubenswrapper[4749]: I0225 07:18:58.987747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.987891 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:58 crc kubenswrapper[4749]: E0225 07:18:58.988089 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:06.988062877 +0000 UTC m=+100.349888947 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.023253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.023830 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.023883 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.023900 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.031883 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a665cc4-2925-4a4d-bd03-3a05d3dee6da" containerID="7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133" exitCode=0 Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.031942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerDied","Data":"7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.046078 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.053353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.053420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.053438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.053463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.053484 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.067839 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.068230 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.071009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.082467 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.089228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:59 crc kubenswrapper[4749]: E0225 07:18:59.089533 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:59 crc kubenswrapper[4749]: E0225 07:18:59.089694 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:07.089654251 +0000 UTC m=+100.451480311 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.093363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.108185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.120930 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.139814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.152386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc 
kubenswrapper[4749]: I0225 07:18:59.156482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.156540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.156562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.156628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.156656 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.167704 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z 
is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.182240 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.194414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.209754 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.225908 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.240974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.260011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 
07:18:59.260350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.260370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.260394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.260415 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.262001 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea2695
60d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.274214 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.288936 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.301458 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.314168 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.321517 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:18:59 crc kubenswrapper[4749]: E0225 07:18:59.321629 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.321912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:18:59 crc kubenswrapper[4749]: E0225 07:18:59.321978 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.321901 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:18:59 crc kubenswrapper[4749]: E0225 07:18:59.322038 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.330893 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.350278 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.363732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.363790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.363807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.363829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.363848 4749 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.370654 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.386847 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.400745 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.417734 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc 
kubenswrapper[4749]: I0225 07:18:59.436769 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc 
kubenswrapper[4749]: I0225 07:18:59.448641 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.458072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.466531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.466567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.466576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.466615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.466627 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.470441 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.487286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:18:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.568937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.568978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.568994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.569017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.569034 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.672347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.672392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.672408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.672430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.672445 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.775449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.775493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.775543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.775564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.775577 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.878339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.878412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.878440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.878475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.878503 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.980796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.980823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.980832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.980845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:18:59 crc kubenswrapper[4749]: I0225 07:18:59.980855 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:18:59Z","lastTransitionTime":"2026-02-25T07:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.042196 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" event={"ID":"4a665cc4-2925-4a4d-bd03-3a05d3dee6da","Type":"ContainerStarted","Data":"8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.066107 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.083420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.083487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.083507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.083532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.083551 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.096250 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.118435 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.135022 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.153174 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.168002 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc 
kubenswrapper[4749]: I0225 07:19:00.186304 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc 
kubenswrapper[4749]: I0225 07:19:00.186487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.186524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.186538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.186553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.186956 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.212102 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.229657 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.242508 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.246357 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.262297 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.280842 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:5
2Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.288748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.288808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.288826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.288850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.288870 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.300585 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.316745 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.321692 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:00 crc kubenswrapper[4749]: E0225 07:19:00.321772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.334867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:00Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.391907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.391993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.392018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.392050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.392080 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.493805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.493846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.493860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.493876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.493888 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.597058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.597113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.597129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.597149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.597164 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.699711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.699754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.699767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.699787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.699813 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.802189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.802243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.802285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.802325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.802349 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.904863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.904913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.904930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.904952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:00 crc kubenswrapper[4749]: I0225 07:19:00.904970 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:00Z","lastTransitionTime":"2026-02-25T07:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.008967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.009017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.009028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.009046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.009058 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.112624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.112723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.112736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.112759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.112772 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.216251 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.216310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.216331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.216362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.216386 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.319401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.319474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.319486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.319504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.319518 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.321836 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.321909 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.321834 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:01 crc kubenswrapper[4749]: E0225 07:19:01.321963 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:01 crc kubenswrapper[4749]: E0225 07:19:01.322069 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:01 crc kubenswrapper[4749]: E0225 07:19:01.322295 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.422190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.422236 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.422245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.422260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.422269 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.526242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.526308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.526325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.526351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.526368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.628743 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.628779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.628789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.628805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.628814 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.770497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.770563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.770575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.770615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.770631 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.874919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.874974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.874990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.875014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.875032 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.977114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.977177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.977189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.977231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:01 crc kubenswrapper[4749]: I0225 07:19:01.977256 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:01Z","lastTransitionTime":"2026-02-25T07:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.079169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.079204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.079213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.079244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.079257 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.182248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.182318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.182337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.182365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.182383 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.285857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.285926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.285944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.285977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.285996 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.322211 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:02 crc kubenswrapper[4749]: E0225 07:19:02.322406 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.388907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.388978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.388995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.389018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.389037 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.493322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.493394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.493417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.493442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.493459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.597760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.597833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.597849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.597875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.597893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.701411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.701490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.701514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.701545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.701564 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.804712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.804785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.804815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.804841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.804864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.908085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.908147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.908164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.908188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:02 crc kubenswrapper[4749]: I0225 07:19:02.908212 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:02Z","lastTransitionTime":"2026-02-25T07:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.012122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.012560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.012579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.012647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.012673 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.055635 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/0.log" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.058943 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821" exitCode=1 Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.058981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.059843 4749 scope.go:117] "RemoveContainer" containerID="a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.076769 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.100404 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.117287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.117366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.117393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.117422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.117445 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.118178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.130348 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.177218 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.195652 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.211558 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.220044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 
07:19:03.220098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.220115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.220139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.220156 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.244854 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07
:19:02Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI0225 07:19:02.163414 6596 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 07:19:02.163435 6596 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 07:19:02.163449 6596 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:02.163491 6596 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163507 6596 factory.go:656] Stopping watch factory\\\\nI0225 07:19:02.163362 6596 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163391 6596 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0225 07:19:02.163654 6596 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 07:19:02.163701 6596 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:02.163833 6596 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164194 6596 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164246 6596 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88e
da3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.262455 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.280932 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d5
21a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.300377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.314326 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.321653 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.321681 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.321733 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:03 crc kubenswrapper[4749]: E0225 07:19:03.321821 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:03 crc kubenswrapper[4749]: E0225 07:19:03.321954 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:03 crc kubenswrapper[4749]: E0225 07:19:03.322260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.323686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.323715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.323725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.323741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.323751 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.335383 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.349073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.362542 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:03Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.427033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.427086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.427103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.427125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.427142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.530762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.530824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.530841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.530865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.530884 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.634436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.634497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.634513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.634536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.634555 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.737885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.737934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.737950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.737975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.737994 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.841995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.842069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.842092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.842124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.842148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.946564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.946666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.946684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.946710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:03 crc kubenswrapper[4749]: I0225 07:19:03.946729 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:03Z","lastTransitionTime":"2026-02-25T07:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.049572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.049666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.049685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.049713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.049731 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.152489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.152568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.152591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.152651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.152674 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.255638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.255713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.255737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.255763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.255779 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.321248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:04 crc kubenswrapper[4749]: E0225 07:19:04.321432 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.359170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.359246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.359263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.359285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.359307 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.464180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.464241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.464259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.464330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.464348 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.568070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.568137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.568161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.568185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.568202 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.671326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.671377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.671428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.671460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.671482 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.774288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.774767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.774794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.774816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.774831 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.877935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.877997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.878014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.878037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.878053 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.987688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.987783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.987811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.987844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:04 crc kubenswrapper[4749]: I0225 07:19:04.987868 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:04Z","lastTransitionTime":"2026-02-25T07:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.069961 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/0.log" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.073473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.074119 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.091434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.091512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.091536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.091567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.091626 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.094919 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z 
is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.114298 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.128700 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.151182 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.172320 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.194844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.194881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.194895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.194912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.194925 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.202235 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.218741 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc 
kubenswrapper[4749]: I0225 07:19:05.251056 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:02Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI0225 07:19:02.163414 6596 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 07:19:02.163435 6596 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 07:19:02.163449 6596 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:02.163491 6596 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163507 6596 factory.go:656] Stopping watch factory\\\\nI0225 07:19:02.163362 6596 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163391 6596 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0225 07:19:02.163654 6596 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 07:19:02.163701 6596 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:02.163833 6596 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164194 6596 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164246 6596 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.268578 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.282226 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d5
21a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.294997 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.296851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.296895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.296907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.296929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.296942 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.307202 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.319516 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.321627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.321627 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:05 crc kubenswrapper[4749]: E0225 07:19:05.321752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:05 crc kubenswrapper[4749]: E0225 07:19:05.321843 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.323112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:05 crc kubenswrapper[4749]: E0225 07:19:05.323485 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.331419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.350677 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:05Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.399325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.399516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.399576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.399667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.399729 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.502183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.502398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.502459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.502517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.502578 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.606191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.606246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.606260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.606284 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.606300 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.709950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.710033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.710062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.710095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.710120 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.813204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.813286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.813309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.813340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.813363 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.915962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.916007 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.916018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.916033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:05 crc kubenswrapper[4749]: I0225 07:19:05.916045 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:05Z","lastTransitionTime":"2026-02-25T07:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.018429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.018513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.018539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.018570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.018624 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.082411 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/1.log" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.083044 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/0.log" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.086667 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549" exitCode=1 Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.086712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.086750 4749 scope.go:117] "RemoveContainer" containerID="a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.087910 4749 scope.go:117] "RemoveContainer" containerID="3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549" Feb 25 07:19:06 crc kubenswrapper[4749]: E0225 07:19:06.088322 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.107226 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.121113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.121153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.121165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.121182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.121193 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.124214 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.137249 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.151049 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc 
kubenswrapper[4749]: I0225 07:19:06.166062 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc 
kubenswrapper[4749]: I0225 07:19:06.182680 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.197042 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.216218 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a28858faebe7c85dd67e6311758fc84ebac9b06bd978020fe069a9a0afee0821\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:02Z\\\",\\\"message\\\":\\\"9 for removal\\\\nI0225 07:19:02.163414 6596 handler.go:208] Removed *v1.Node event handler 7\\\\nI0225 07:19:02.163435 6596 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0225 07:19:02.163449 6596 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:02.163491 6596 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163507 6596 factory.go:656] Stopping watch factory\\\\nI0225 07:19:02.163362 6596 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0225 07:19:02.163391 6596 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0225 07:19:02.163654 6596 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0225 07:19:02.163701 6596 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:02.163833 6596 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164194 6596 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:02.164246 6596 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 
metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"na
me\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 
07:19:06.226141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.226188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.226199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.226216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.226228 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.236101 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.251892 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.265943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.276467 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.288692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.302701 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.319744 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:06Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.321848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:06 crc kubenswrapper[4749]: E0225 07:19:06.321979 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.328763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.328820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.328841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.328866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.328884 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.432280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.432360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.432390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.432420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.432442 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.535298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.535387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.535405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.535444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.535462 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.638484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.638557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.638574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.638624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.638645 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.741067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.741137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.741155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.741181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.741201 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.843997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.844076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.844098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.844127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.844149 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.946961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.947010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.947022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.947037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:06 crc kubenswrapper[4749]: I0225 07:19:06.947047 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:06Z","lastTransitionTime":"2026-02-25T07:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.050566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.051002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.051017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.051035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.051049 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.082756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.082961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.082998 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.0829706 +0000 UTC m=+116.444796620 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.083043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083114 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.083146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083185 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.083162645 +0000 UTC m=+116.444988695 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.083219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083291 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083348 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083366 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.08334572 +0000 UTC m=+116.445171780 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083368 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083305 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083392 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083411 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083433 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.083420202 +0000 UTC m=+116.445246252 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083434 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.083494 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.083475733 +0000 UTC m=+116.445301883 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.093915 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/1.log" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.098139 4749 scope.go:117] "RemoveContainer" containerID="3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.098391 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.115358 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.132052 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d5
21a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.146212 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.153274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.153319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.153337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.153366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.153385 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.158652 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.179178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.184915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.185147 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.185951 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:23.185916851 +0000 UTC m=+116.547742981 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.196119 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.217786 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.236838 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.253573 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.255903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.255934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.255974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.255993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.256005 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.269892 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.285904 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.299651 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.313929 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.322991 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.323060 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.323002 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.323229 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.323465 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.323630 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.326237 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc 
kubenswrapper[4749]: I0225 07:19:07.348775 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.359286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.359349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.359371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.359402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.359429 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.368792 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.383353 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.399417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.421205 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.442351 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.460798 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.463282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.463359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.463387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.463417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.463442 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.477545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.493310 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.509363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.523204 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc 
kubenswrapper[4749]: I0225 07:19:07.536967 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc 
kubenswrapper[4749]: I0225 07:19:07.549278 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.560570 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.567873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.568169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.568270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.568383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.568466 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.576495 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.598353 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.672059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.672184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.672206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.672231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.672248 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.684691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.684781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.684833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.684856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.684873 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.706846 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.712012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.712089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.712110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.712131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.712182 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.732730 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.738201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.738290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.738344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.738367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.738385 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.759204 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.765326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.765411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.765435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.765466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.765489 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.785220 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.790669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.790723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.790740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.790764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.790781 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.812273 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:07Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:07 crc kubenswrapper[4749]: E0225 07:19:07.812523 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.814578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.814683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.814712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.814742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.814766 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.917634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.917695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.917706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.917722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:07 crc kubenswrapper[4749]: I0225 07:19:07.917732 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:07Z","lastTransitionTime":"2026-02-25T07:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.020290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.020348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.020364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.020389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.020407 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.123120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.123181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.123197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.123222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.123240 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.226770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.226840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.226857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.226881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.226898 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.322100 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:08 crc kubenswrapper[4749]: E0225 07:19:08.322278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.322947 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.332116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.332291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.332438 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.332580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.332743 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.435300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.435334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.435341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.435377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.435387 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.537777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.537814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.537826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.537844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.537857 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.641000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.641050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.641068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.641091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.641110 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.744661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.744980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.745113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.745242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.745361 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.848803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.848870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.848893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.848919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.848937 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.952090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.952505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.952646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.952911 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:08 crc kubenswrapper[4749]: I0225 07:19:08.953114 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:08Z","lastTransitionTime":"2026-02-25T07:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.059453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.060172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.060390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.060553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.060736 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.144454 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.147660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.148484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.163663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.163723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.163739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.163758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.163774 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.171666 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.186706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.205045 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.219511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.236170 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.251218 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.266842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.266869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.266877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.266890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.266900 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.270883 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.285127 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.303098 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.317748 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.321494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:09 crc kubenswrapper[4749]: E0225 07:19:09.321924 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.321999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:09 crc kubenswrapper[4749]: E0225 07:19:09.322208 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.321996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:09 crc kubenswrapper[4749]: E0225 07:19:09.322468 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.333240 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.342354 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.353333 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.371103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.371163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.371179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 
07:19:09.371203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.371221 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.375854 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.395702 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.428056 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:09Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.478093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.478141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.478159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.478182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.478200 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.580821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.580898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.580916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.580942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.580962 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.683468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.683535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.683555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.683580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.683633 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.787072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.787132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.787149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.787174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.787190 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.890710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.890776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.890792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.890814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.890829 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.994216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.994288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.994305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.994330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:09 crc kubenswrapper[4749]: I0225 07:19:09.994346 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:09Z","lastTransitionTime":"2026-02-25T07:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.097350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.097418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.097455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.097487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.097511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.200314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.200393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.200430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.200461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.200484 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.303676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.303727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.303744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.303770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.303787 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.321689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:10 crc kubenswrapper[4749]: E0225 07:19:10.321900 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.408222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.408287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.408306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.408334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.408359 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.512279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.512350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.512370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.512397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.512418 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.615702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.615777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.615814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.615847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.615871 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.719216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.719275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.719289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.719308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.719323 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.823197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.823233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.823244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.823259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.823269 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.926774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.926856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.926878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.926912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:10 crc kubenswrapper[4749]: I0225 07:19:10.926937 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:10Z","lastTransitionTime":"2026-02-25T07:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.030021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.030061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.030070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.030085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.030095 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.133796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.133855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.133871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.133895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.133912 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.236519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.236573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.236586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.236620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.236633 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.321892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.321906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:11 crc kubenswrapper[4749]: E0225 07:19:11.322118 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.321900 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:11 crc kubenswrapper[4749]: E0225 07:19:11.322258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:11 crc kubenswrapper[4749]: E0225 07:19:11.322378 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.339387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.339424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.339434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.339448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.339460 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.442725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.442832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.442855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.442874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.442887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.546257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.546312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.546325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.546347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.546360 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.649267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.649337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.649351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.649367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.649378 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.751772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.751873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.751899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.751934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.751965 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.854790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.854848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.854864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.854888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.854902 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.959491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.959566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.959625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.959675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:11 crc kubenswrapper[4749]: I0225 07:19:11.959702 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:11Z","lastTransitionTime":"2026-02-25T07:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.063437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.063508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.063525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.063550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.063569 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.166977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.167037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.167053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.167077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.167095 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.269960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.270014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.270031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.270054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.270071 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.321707 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:12 crc kubenswrapper[4749]: E0225 07:19:12.321935 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.373096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.373149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.373158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.373174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.373186 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.475644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.475708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.475726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.475755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.475771 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.577930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.578016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.578040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.578069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.578090 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.680586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.680653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.680661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.680679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.680691 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.782990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.783045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.783059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.783082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.783095 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.886443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.886505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.886517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.886537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.886549 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.989355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.989466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.989500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.989531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:12 crc kubenswrapper[4749]: I0225 07:19:12.989557 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:12Z","lastTransitionTime":"2026-02-25T07:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.092767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.092821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.092839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.092863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.092879 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.195213 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.195273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.195290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.195314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.195330 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.298074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.298115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.298127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.298145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.298183 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.321326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.321408 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.321348 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:13 crc kubenswrapper[4749]: E0225 07:19:13.321490 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:13 crc kubenswrapper[4749]: E0225 07:19:13.321766 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:13 crc kubenswrapper[4749]: E0225 07:19:13.321985 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.401132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.401198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.401221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.401249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.401271 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.504284 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.504359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.504382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.504409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.504431 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.607636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.607703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.607720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.607747 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.607764 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.710768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.710847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.710865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.710887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.710905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.814233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.814297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.814320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.814346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.814366 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.917741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.917787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.917799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.917815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:13 crc kubenswrapper[4749]: I0225 07:19:13.917829 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:13Z","lastTransitionTime":"2026-02-25T07:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.021290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.021340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.021356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.021382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.021404 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.124497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.124562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.124579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.124631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.124649 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.227785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.227818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.227827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.227839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.227847 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.321367 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:14 crc kubenswrapper[4749]: E0225 07:19:14.321677 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.330319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.330415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.330444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.330471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.330489 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.433632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.433678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.433693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.433715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.433730 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.535982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.536047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.536059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.536076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.536088 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.639222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.639309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.639331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.639359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.639380 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.742303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.742382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.742405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.742442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.742465 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.845859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.845953 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.845976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.846005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.846022 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.949155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.949256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.949381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.949787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:14 crc kubenswrapper[4749]: I0225 07:19:14.949993 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:14Z","lastTransitionTime":"2026-02-25T07:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.053287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.053336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.053353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.053375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.053393 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.156232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.156285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.156303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.156326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.156344 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.259219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.259280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.259300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.259328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.259346 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.321381 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.321473 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.321397 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:15 crc kubenswrapper[4749]: E0225 07:19:15.321707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:15 crc kubenswrapper[4749]: E0225 07:19:15.321887 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:15 crc kubenswrapper[4749]: E0225 07:19:15.322082 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.362450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.362518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.362532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.362551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.362953 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.466025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.466107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.466126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.466149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.466167 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.569230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.569293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.569307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.569333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.569348 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.672214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.672291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.672309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.672338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.672355 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.783417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.783533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.783559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.783591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.783651 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.887166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.887272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.887301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.887341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.887368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.990265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.990333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.990352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.990372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:15 crc kubenswrapper[4749]: I0225 07:19:15.990386 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:15Z","lastTransitionTime":"2026-02-25T07:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.093805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.093873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.093891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.093914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.093931 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.196755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.196818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.196838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.196864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.196882 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.300211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.300285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.300311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.300340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.300362 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.322210 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:16 crc kubenswrapper[4749]: E0225 07:19:16.322403 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.403935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.404017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.404037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.404063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.404085 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.507418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.507475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.507494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.507516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.507532 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.610222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.610287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.610304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.610327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.610345 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.713910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.713975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.713994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.714027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.714056 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.816962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.817033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.817049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.817073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.817100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.920502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.920563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.920579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.920633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:16 crc kubenswrapper[4749]: I0225 07:19:16.920651 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:16Z","lastTransitionTime":"2026-02-25T07:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.023959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.024034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.024061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.024096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.024158 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.127211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.127268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.127287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.127311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.127406 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.230242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.230302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.230319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.230342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.230358 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.321868 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.321938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:17 crc kubenswrapper[4749]: E0225 07:19:17.322066 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.322138 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:17 crc kubenswrapper[4749]: E0225 07:19:17.322263 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:17 crc kubenswrapper[4749]: E0225 07:19:17.322353 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.342401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.342555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.342581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.342680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.342738 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.348448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.370008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.404285 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.426818 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.442320 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.446665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.446742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.446767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.446800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.446826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.458715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.480009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.499427 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.515866 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.528712 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.541556 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.550262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.550312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.550333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 
07:19:17.550357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.550375 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.555692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.568590 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.581548 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc 
kubenswrapper[4749]: I0225 07:19:17.594363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc 
kubenswrapper[4749]: I0225 07:19:17.611205 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.652970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.653005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.653016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.653030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.653043 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.755382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.755430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.755442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.755460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.755473 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.859207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.859263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.859281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.859303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.859319 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.963152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.963211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.963229 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.963254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.963271 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.976454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.976519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.976534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.976556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:17 crc kubenswrapper[4749]: I0225 07:19:17.976570 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:17Z","lastTransitionTime":"2026-02-25T07:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:17 crc kubenswrapper[4749]: E0225 07:19:17.995221 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:17Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.000042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.000090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.000107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.000128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.000144 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.018250 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.023199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.023242 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.023285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.023307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.023323 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.042715 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.052823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.052909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.053014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.053065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.053128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.072117 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.077563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.077668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.077697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.077725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.077747 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.097452 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.097701 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.099974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.100019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.100035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.100058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.100077 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.202162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.202203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.202265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.202288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.202301 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.305590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.305707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.305732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.305763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.305786 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.322155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:18 crc kubenswrapper[4749]: E0225 07:19:18.322330 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.409288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.409340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.409357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.409381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.409400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.512775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.512850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.512872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.512932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.512954 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.616369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.616426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.616443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.616469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.616493 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.719705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.719768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.719784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.719810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.719836 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.753681 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.771100 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.792778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.822650 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.822701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.822721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.822745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.822762 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.824952 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.846054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.869233 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.885400 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.906494 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.923302 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.926159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.926228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.926253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:18 crc 
kubenswrapper[4749]: I0225 07:19:18.926278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.926297 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:18Z","lastTransitionTime":"2026-02-25T07:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.936586 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc 
kubenswrapper[4749]: I0225 07:19:18.950858 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc 
kubenswrapper[4749]: I0225 07:19:18.966909 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:18 crc kubenswrapper[4749]: I0225 07:19:18.981972 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:18Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.017128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:19Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.028825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.028891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.028910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.028937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.028955 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.053820 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:19Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.079150 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:19Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.098493 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:19Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.131993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.132058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.132075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.132100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.132117 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.235198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.235275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.235299 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.235326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.235350 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.321698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.321731 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.321717 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:19 crc kubenswrapper[4749]: E0225 07:19:19.321870 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:19 crc kubenswrapper[4749]: E0225 07:19:19.322077 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:19 crc kubenswrapper[4749]: E0225 07:19:19.322270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.337658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.337716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.337771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.337799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.337821 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.446939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.446989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.447007 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.447032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.447051 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.550064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.550102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.550113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.550128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.550139 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.652584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.652676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.652687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.652738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.652751 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.755171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.755222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.755237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.755259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.755275 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.858278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.858324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.858333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.858351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.858360 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.961290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.961349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.961367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.961391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:19 crc kubenswrapper[4749]: I0225 07:19:19.961408 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:19Z","lastTransitionTime":"2026-02-25T07:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.064954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.065067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.065097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.065177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.065253 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.167586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.167667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.167679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.167719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.167734 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.271679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.271746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.271756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.271784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.271795 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.321568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:20 crc kubenswrapper[4749]: E0225 07:19:20.321804 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.374644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.374716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.374740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.374771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.374793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.477302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.477368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.477393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.477421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.477442 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.579872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.579928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.579940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.579957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.579971 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.683535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.683671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.683708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.683739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.683763 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.786981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.787084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.787111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.787140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.787162 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.889528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.889564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.889576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.889614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.889627 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.992157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.992217 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.992233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.992256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:20 crc kubenswrapper[4749]: I0225 07:19:20.992272 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:20Z","lastTransitionTime":"2026-02-25T07:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.095765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.095821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.095857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.095880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.095898 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.198551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.198632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.198647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.198668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.198682 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.303059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.303186 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.303258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.303283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.303305 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.322098 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.322175 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:21 crc kubenswrapper[4749]: E0225 07:19:21.322248 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.322312 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:21 crc kubenswrapper[4749]: E0225 07:19:21.322569 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:21 crc kubenswrapper[4749]: E0225 07:19:21.322795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.406206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.406272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.406291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.406318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.406341 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.508833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.508885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.508896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.508913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.508923 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.611553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.611639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.611673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.611690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.611701 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.714379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.714411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.714436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.714453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.714462 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.833061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.833137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.833160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.833188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.833214 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.936287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.936352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.936374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.936403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:21 crc kubenswrapper[4749]: I0225 07:19:21.936426 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:21Z","lastTransitionTime":"2026-02-25T07:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.039987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.040059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.040073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.040107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.040120 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.149359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.149443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.149470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.149504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.149528 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.252900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.252962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.252973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.252994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.253007 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.321914 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:22 crc kubenswrapper[4749]: E0225 07:19:22.322131 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.323466 4749 scope.go:117] "RemoveContainer" containerID="3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.356423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.356529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.356590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.356738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.356776 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.461698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.464114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.464201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.464241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.464266 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.567741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.567790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.567809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.567832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.567850 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.669880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.669932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.669943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.669958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.669979 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.773583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.773678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.773696 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.773723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.773741 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.905544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.905610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.905624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.905641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:22 crc kubenswrapper[4749]: I0225 07:19:22.905654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:22Z","lastTransitionTime":"2026-02-25T07:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.008340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.008402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.008427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.008454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.008473 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.098186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.098377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.098422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.098475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.098507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098688 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098711 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098729 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098809 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.098778045 +0000 UTC m=+148.460604085 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098832 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.098822086 +0000 UTC m=+148.460648116 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098825 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098893 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098928 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:19:23 crc 
kubenswrapper[4749]: E0225 07:19:23.098946 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098949 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.098921608 +0000 UTC m=+148.460747658 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.098895 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.099011 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.09898696 +0000 UTC m=+148.460813080 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.099092 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.099066662 +0000 UTC m=+148.460892992 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.111268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.111352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.111372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.111394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.111409 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.199280 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.199494 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.199645 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:19:55.199616671 +0000 UTC m=+148.561442721 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.203408 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/1.log" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.206475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.206835 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.213655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.213684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.213693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.213704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.213712 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.223458 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.238830 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.253753 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.268973 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.281110 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.293963 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.307093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc 
kubenswrapper[4749]: I0225 07:19:23.315930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.315981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.316043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.316066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.316083 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.322276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.322445 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.322531 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.322538 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.322799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:23 crc kubenswrapper[4749]: E0225 07:19:23.322866 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.324352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.335103 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.345157 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.368375 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.381993 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.395866 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.418899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.418962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.418978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.419003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.419021 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.427455 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.445854 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.461558 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:23Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.523128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.523188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.523203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.523234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.523247 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.626858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.626910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.626936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.626971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.626988 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.730478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.730538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.730554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.730590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.730649 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.834073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.834147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.834170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.834197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.834218 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.937450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.937526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.937549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.937579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:23 crc kubenswrapper[4749]: I0225 07:19:23.937638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:23Z","lastTransitionTime":"2026-02-25T07:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.040507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.040565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.040589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.040652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.040676 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.143391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.143453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.143476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.143508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.143535 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.212302 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/2.log" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.213668 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/1.log" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.218308 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" exitCode=1 Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.218466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.218568 4749 scope.go:117] "RemoveContainer" containerID="3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.219588 4749 scope.go:117] "RemoveContainer" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" Feb 25 07:19:24 crc kubenswrapper[4749]: E0225 07:19:24.220009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.240424 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.246326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.246358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.246366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.246380 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.246393 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.257838 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.281909 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.296672 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc 
kubenswrapper[4749]: I0225 07:19:24.315317 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc 
kubenswrapper[4749]: I0225 07:19:24.321971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:24 crc kubenswrapper[4749]: E0225 07:19:24.322127 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.334027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026
-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.348926 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.351040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.351538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.351777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.351929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.352111 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.371006 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.391331 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.408371 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.440762 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bc6a1f8dd42855f0c652e12838cab6c6c7b2f0ed94658c27963bcfe198b2549\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:05Z\\\",\\\"message\\\":\\\"ics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0225 07:19:05.609195 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0225 07:19:05.609202 6745 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0225 07:19:05.609263 6745 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 services_controller.go:443] Built service 
openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mount
Path\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 
07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.455395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.455452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.455468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.455492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.455509 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.462726 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.486044 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.523697 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.552423 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.558546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.558584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.558620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.558638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.558669 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.570872 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:24Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.661694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.661741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.661759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.661782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.661799 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.763992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.764258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.764362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.764455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.764562 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.867392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.867465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.867488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.867516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.867538 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.971085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.971147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.971170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.971202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:24 crc kubenswrapper[4749]: I0225 07:19:24.971223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:24Z","lastTransitionTime":"2026-02-25T07:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.074025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.074081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.074099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.074122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.074141 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.178191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.178568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.178847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.179037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.179236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.225285 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/2.log" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.230411 4749 scope.go:117] "RemoveContainer" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" Feb 25 07:19:25 crc kubenswrapper[4749]: E0225 07:19:25.230789 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.242435 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.255302 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.273698 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.282536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.282869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.283265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.283480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.283755 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.289551 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.306793 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.321312 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.321396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:25 crc kubenswrapper[4749]: E0225 07:19:25.321474 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:25 crc kubenswrapper[4749]: E0225 07:19:25.321589 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.321933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:25 crc kubenswrapper[4749]: E0225 07:19:25.322138 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.324217 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.342202 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.362659 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.381382 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.385998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.386144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.386222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.386289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.386345 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.393793 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.405069 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.419288 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc 
kubenswrapper[4749]: I0225 07:19:25.433055 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc 
kubenswrapper[4749]: I0225 07:19:25.446188 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.458538 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.479353 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:25Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.489020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.489046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.489055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.489068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.489078 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.591844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.592120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.592281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.592398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.592519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.695431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.695476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.695491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.695511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.695527 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.798806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.798846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.798858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.798875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.798887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.902280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.902351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.902378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.902407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:25 crc kubenswrapper[4749]: I0225 07:19:25.902431 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:25Z","lastTransitionTime":"2026-02-25T07:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.005783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.005837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.005856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.005878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.005895 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.109671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.109734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.109754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.109781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.109808 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.212564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.213033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.213203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.213355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.213484 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.316144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.316203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.316224 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.316250 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.316272 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.321898 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:26 crc kubenswrapper[4749]: E0225 07:19:26.322114 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.419402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.419897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.420102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.420307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.420517 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.523752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.524042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.524219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.524490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.524766 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.627741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.627789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.627802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.627820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.627831 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.731012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.731065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.731082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.731104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.731122 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.834375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.834428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.834446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.834470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.834487 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.942881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.942950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.942970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.943039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:26 crc kubenswrapper[4749]: I0225 07:19:26.943069 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:26Z","lastTransitionTime":"2026-02-25T07:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.046276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.046313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.046324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.046339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.046351 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:27Z","lastTransitionTime":"2026-02-25T07:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.149066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.149125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.149172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.149202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.149223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:27Z","lastTransitionTime":"2026-02-25T07:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:27 crc kubenswrapper[4749]: E0225 07:19:27.249981 4749 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.321351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.321423 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:27 crc kubenswrapper[4749]: E0225 07:19:27.321557 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.321628 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:27 crc kubenswrapper[4749]: E0225 07:19:27.321741 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:27 crc kubenswrapper[4749]: E0225 07:19:27.321912 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.344159 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.376918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.396777 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.410980 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: E0225 07:19:27.419211 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.436680 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.455196 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.474646 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.497468 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.516664 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.533125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.546180 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.558313 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc 
kubenswrapper[4749]: I0225 07:19:27.575182 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc 
kubenswrapper[4749]: I0225 07:19:27.593843 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.611012 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:27 crc kubenswrapper[4749]: I0225 07:19:27.639527 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:27Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.268105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.268202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.268224 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.268250 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.268268 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:28Z","lastTransitionTime":"2026-02-25T07:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.291403 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:28Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.296741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.296795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.296818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.296848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.296869 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:28Z","lastTransitionTime":"2026-02-25T07:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.317583 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:28Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.321966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.322212 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.323214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.323266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.323289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.323316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.323337 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:28Z","lastTransitionTime":"2026-02-25T07:19:28Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.342297 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:28Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.348461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.348505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.348523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.348545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.348563 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:28Z","lastTransitionTime":"2026-02-25T07:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.368311 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:28Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.373366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.373423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.373441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.373469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:28 crc kubenswrapper[4749]: I0225 07:19:28.373489 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:28Z","lastTransitionTime":"2026-02-25T07:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.392990 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:28Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:28 crc kubenswrapper[4749]: E0225 07:19:28.393207 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:19:29 crc kubenswrapper[4749]: I0225 07:19:29.321926 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:29 crc kubenswrapper[4749]: I0225 07:19:29.322044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:29 crc kubenswrapper[4749]: I0225 07:19:29.321947 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:29 crc kubenswrapper[4749]: E0225 07:19:29.322207 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:29 crc kubenswrapper[4749]: E0225 07:19:29.322653 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:29 crc kubenswrapper[4749]: E0225 07:19:29.322404 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:30 crc kubenswrapper[4749]: I0225 07:19:30.322129 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:30 crc kubenswrapper[4749]: E0225 07:19:30.322309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:31 crc kubenswrapper[4749]: I0225 07:19:31.321800 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:31 crc kubenswrapper[4749]: I0225 07:19:31.321882 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:31 crc kubenswrapper[4749]: I0225 07:19:31.321800 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:31 crc kubenswrapper[4749]: E0225 07:19:31.322040 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:31 crc kubenswrapper[4749]: E0225 07:19:31.322193 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:31 crc kubenswrapper[4749]: E0225 07:19:31.322362 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:31 crc kubenswrapper[4749]: I0225 07:19:31.338354 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 25 07:19:32 crc kubenswrapper[4749]: I0225 07:19:32.321963 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:32 crc kubenswrapper[4749]: E0225 07:19:32.322187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:32 crc kubenswrapper[4749]: E0225 07:19:32.420907 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:19:33 crc kubenswrapper[4749]: I0225 07:19:33.322363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:33 crc kubenswrapper[4749]: I0225 07:19:33.322378 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:33 crc kubenswrapper[4749]: I0225 07:19:33.322417 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:33 crc kubenswrapper[4749]: E0225 07:19:33.322549 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:33 crc kubenswrapper[4749]: E0225 07:19:33.322713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:33 crc kubenswrapper[4749]: E0225 07:19:33.322809 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:34 crc kubenswrapper[4749]: I0225 07:19:34.321628 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:34 crc kubenswrapper[4749]: E0225 07:19:34.321779 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:35 crc kubenswrapper[4749]: I0225 07:19:35.321867 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:35 crc kubenswrapper[4749]: I0225 07:19:35.321954 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:35 crc kubenswrapper[4749]: I0225 07:19:35.321867 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:35 crc kubenswrapper[4749]: E0225 07:19:35.322113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:35 crc kubenswrapper[4749]: E0225 07:19:35.322380 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:35 crc kubenswrapper[4749]: E0225 07:19:35.322520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:36 crc kubenswrapper[4749]: I0225 07:19:36.321637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:36 crc kubenswrapper[4749]: E0225 07:19:36.321817 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.322176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.322298 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.322351 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:37 crc kubenswrapper[4749]: E0225 07:19:37.322349 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:37 crc kubenswrapper[4749]: E0225 07:19:37.322625 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:37 crc kubenswrapper[4749]: E0225 07:19:37.322808 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.323884 4749 scope.go:117] "RemoveContainer" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" Feb 25 07:19:37 crc kubenswrapper[4749]: E0225 07:19:37.324293 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.341775 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.362232 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.377778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.396261 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.412328 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc 
kubenswrapper[4749]: E0225 07:19:37.422220 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.432570 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.450455 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.465880 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.498255 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.519663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.537169 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.561530 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.582183 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.594887 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.608814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.625367 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:37 crc kubenswrapper[4749]: I0225 07:19:37.642356 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:37Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.321821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.321969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.604272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.604333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.604351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.604375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.604395 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:38Z","lastTransitionTime":"2026-02-25T07:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.623590 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:38Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.628449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.628513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.628541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.628572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.628643 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:38Z","lastTransitionTime":"2026-02-25T07:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.648630 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:38Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.652986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.653039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.653057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.653082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.653100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:38Z","lastTransitionTime":"2026-02-25T07:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.672958 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:38Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.677311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.677374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.677395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.677416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.677433 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:38Z","lastTransitionTime":"2026-02-25T07:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.697790 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:38Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.702389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.702473 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.702527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.702549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:38 crc kubenswrapper[4749]: I0225 07:19:38.702566 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:38Z","lastTransitionTime":"2026-02-25T07:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.723578 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:38Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:38 crc kubenswrapper[4749]: E0225 07:19:38.723923 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:19:39 crc kubenswrapper[4749]: I0225 07:19:39.321659 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:39 crc kubenswrapper[4749]: I0225 07:19:39.321794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:39 crc kubenswrapper[4749]: I0225 07:19:39.322040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:39 crc kubenswrapper[4749]: E0225 07:19:39.322046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:39 crc kubenswrapper[4749]: E0225 07:19:39.322236 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:39 crc kubenswrapper[4749]: E0225 07:19:39.322551 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.285056 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/0.log" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.285137 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c23d4e-91a8-4374-84dc-7bdc7450661d" containerID="c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6" exitCode=1 Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.285179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerDied","Data":"c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6"} Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.285775 4749 scope.go:117] "RemoveContainer" containerID="c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.307250 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.321862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:40 crc kubenswrapper[4749]: E0225 07:19:40.322581 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.327430 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.345903 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.377500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.393481 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.410132 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.426473 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.447300 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.461199 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.469545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc 
kubenswrapper[4749]: I0225 07:19:40.480509 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.490363 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d192723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.497644 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.509994 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.521466 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.531448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:40 crc kubenswrapper[4749]: I0225 07:19:40.546413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:40Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.292210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/0.log" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.292291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerStarted","Data":"f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5"} Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.313577 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.321419 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.321526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:41 crc kubenswrapper[4749]: E0225 07:19:41.321641 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.321716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:41 crc kubenswrapper[4749]: E0225 07:19:41.321791 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:41 crc kubenswrapper[4749]: E0225 07:19:41.322085 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.334735 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.369199 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.391975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.407386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.423314 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.442414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.463046 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.483230 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5a
e5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.503020 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.522501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.539144 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.556167 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc 
kubenswrapper[4749]: I0225 07:19:41.572337 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.589085 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.604329 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:41 crc kubenswrapper[4749]: I0225 07:19:41.628787 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:41Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:42 crc kubenswrapper[4749]: I0225 07:19:42.322183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:42 crc kubenswrapper[4749]: E0225 07:19:42.322455 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:42 crc kubenswrapper[4749]: E0225 07:19:42.423801 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 25 07:19:43 crc kubenswrapper[4749]: I0225 07:19:43.321858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:43 crc kubenswrapper[4749]: E0225 07:19:43.322239 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:43 crc kubenswrapper[4749]: I0225 07:19:43.322563 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:43 crc kubenswrapper[4749]: E0225 07:19:43.322701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:43 crc kubenswrapper[4749]: I0225 07:19:43.323006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:43 crc kubenswrapper[4749]: E0225 07:19:43.323455 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:43 crc kubenswrapper[4749]: I0225 07:19:43.337975 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 25 07:19:44 crc kubenswrapper[4749]: I0225 07:19:44.321530 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:44 crc kubenswrapper[4749]: E0225 07:19:44.321740 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:45 crc kubenswrapper[4749]: I0225 07:19:45.322048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:45 crc kubenswrapper[4749]: I0225 07:19:45.322101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:45 crc kubenswrapper[4749]: E0225 07:19:45.322239 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:45 crc kubenswrapper[4749]: I0225 07:19:45.322325 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:45 crc kubenswrapper[4749]: E0225 07:19:45.322471 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:45 crc kubenswrapper[4749]: E0225 07:19:45.322647 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:46 crc kubenswrapper[4749]: I0225 07:19:46.321832 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:46 crc kubenswrapper[4749]: E0225 07:19:46.322061 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.321847 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:47 crc kubenswrapper[4749]: E0225 07:19:47.322252 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.322293 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.322361 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:47 crc kubenswrapper[4749]: E0225 07:19:47.322473 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:47 crc kubenswrapper[4749]: E0225 07:19:47.322641 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.338279 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b188c20-3f25-4dd5-a437-2ccf3011bd54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:00Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0225 07:17:29.758751 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 07:17:29.760823 1 observer_polling.go:159] Starting file observer\\\\nI0225 07:17:29.800886 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 07:17:29.804973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 07:18:00.214346 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.359310 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.381961 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b
5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.422897 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: E0225 07:19:47.424520 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.443268 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.459658 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.480675 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.499099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.521854 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.538452 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc 
kubenswrapper[4749]: I0225 07:19:47.553538 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.571204 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.586817 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.601795 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.622173 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.639164 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.656162 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:47 crc kubenswrapper[4749]: I0225 07:19:47.688829 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:47Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:48 crc kubenswrapper[4749]: I0225 07:19:48.321276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:48 crc kubenswrapper[4749]: E0225 07:19:48.321445 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.108360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.108399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.108408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.108449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.108460 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:49Z","lastTransitionTime":"2026-02-25T07:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.120897 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:49Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.125290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.125329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.125342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.125359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.125370 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:49Z","lastTransitionTime":"2026-02-25T07:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.140588 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:49Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.144283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.144331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.144342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.144358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.144369 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:49Z","lastTransitionTime":"2026-02-25T07:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.157809 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:49Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.161062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.161110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.161122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.161139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.161153 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:49Z","lastTransitionTime":"2026-02-25T07:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.173323 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:49Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.177387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.177578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.177697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.177838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.177954 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:49Z","lastTransitionTime":"2026-02-25T07:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.190198 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:49Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.190662 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.321845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.322365 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:49 crc kubenswrapper[4749]: I0225 07:19:49.322909 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.322887 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.323429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:49 crc kubenswrapper[4749]: E0225 07:19:49.323721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:50 crc kubenswrapper[4749]: I0225 07:19:50.322314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:50 crc kubenswrapper[4749]: E0225 07:19:50.322480 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:50 crc kubenswrapper[4749]: I0225 07:19:50.323537 4749 scope.go:117] "RemoveContainer" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.321299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.321434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.321512 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:51 crc kubenswrapper[4749]: E0225 07:19:51.321837 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:51 crc kubenswrapper[4749]: E0225 07:19:51.322207 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:51 crc kubenswrapper[4749]: E0225 07:19:51.322417 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.333029 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/2.log" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.338249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.338923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.358014 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.375995 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T0
7:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.401301 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.425831 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.445277 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.462546 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.482907 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.498394 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.510778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc 
kubenswrapper[4749]: I0225 07:19:51.532162 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.549886 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.567342 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.620413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.633618 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.643773 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.664779 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.676147 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b188c20-3f25-4dd5-a437-2ccf3011bd54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:00Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 07:17:29.758751 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 07:17:29.760823 1 observer_polling.go:159] Starting file observer\\\\nI0225 07:17:29.800886 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 07:17:29.804973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 07:18:00.214346 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:51 crc kubenswrapper[4749]: I0225 07:19:51.685952 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:51Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.322083 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:52 crc kubenswrapper[4749]: E0225 07:19:52.322282 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.344997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/3.log" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.346336 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/2.log" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.350913 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" exitCode=1 Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.350969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.351017 4749 scope.go:117] "RemoveContainer" containerID="2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.352028 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:19:52 crc kubenswrapper[4749]: 
E0225 07:19:52.352261 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.388526 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.403459 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b188c20-3f25-4dd5-a437-2ccf3011bd54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:00Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 07:17:29.758751 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 07:17:29.760823 1 observer_polling.go:159] Starting file observer\\\\nI0225 07:17:29.800886 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 07:17:29.804973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 07:18:00.214346 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: E0225 07:19:52.425886 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.430807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.446205 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.461858 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.480506 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T0
7:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.495869 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.509384 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.525106 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.548627 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.567289 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.583667 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.595662 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc 
kubenswrapper[4749]: I0225 07:19:52.609776 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.627390 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.642722 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.663156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:52 crc kubenswrapper[4749]: I0225 07:19:52.684155 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c5824af03d65822da94b31ae73d8cefbde5869dd3ea810791fe0a3e7d055471\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:23Z\\\",\\\"message\\\":\\\"rafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0225 07:19:23.322225 7004 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager-operator/metrics for network=default are: map[]\\\\nI0225 07:19:23.322232 7004 
services_controller.go:443] Built service openshift-kube-controller-manager-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0225 07:19:23.322241 7004 services_controller.go:444] Built service openshift-kube-controller-manager-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0225 07:19:23.322246 7004 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-h66ds] creating logical port openshift-multus_network-metrics-daemon-h66ds for pod on switch crc\\\\nF0225 07:19:23.322260 7004 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:51Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0225 07:19:51.629378 7281 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.629614 7281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:51.629616 7281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.630038 7281 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 07:19:51.630089 7281 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 07:19:51.630095 7281 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 07:19:51.630116 7281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0225 07:19:51.630137 7281 factory.go:656] Stopping watch factory\\\\nI0225 07:19:51.630147 7281 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 07:19:51.630153 7281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:51.630161 7281 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:51.630164 7281 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\
\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:52Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.322223 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.322318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:53 crc kubenswrapper[4749]: E0225 07:19:53.322428 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.322529 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:53 crc kubenswrapper[4749]: E0225 07:19:53.322781 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:53 crc kubenswrapper[4749]: E0225 07:19:53.322911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.356343 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/3.log" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.361067 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:19:53 crc kubenswrapper[4749]: E0225 07:19:53.361330 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.373341 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.395120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25
T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\
\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.412921 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.431415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.450938 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.471236 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.490685 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.516502 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.547398 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:51Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0225 07:19:51.629378 7281 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.629614 7281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:51.629616 7281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.630038 7281 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 07:19:51.630089 7281 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 07:19:51.630095 7281 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 07:19:51.630116 7281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0225 07:19:51.630137 7281 factory.go:656] Stopping watch factory\\\\nI0225 07:19:51.630147 7281 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 07:19:51.630153 7281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:51.630161 7281 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:51.630164 7281 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.567355 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b188c20-3f25-4dd5-a437-2ccf3011bd54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:00Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 07:17:29.758751 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 07:17:29.760823 1 observer_polling.go:159] Starting file observer\\\\nI0225 07:17:29.800886 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 07:17:29.804973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 07:18:00.214346 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.580946 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.605143 4749 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e6231
3bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.640936 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.665403 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.681963 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.700713 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.719528 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:53 crc kubenswrapper[4749]: I0225 07:19:53.739267 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfcc
c6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:53Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:54 crc kubenswrapper[4749]: I0225 07:19:54.322124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:54 crc kubenswrapper[4749]: E0225 07:19:54.322273 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.162687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.162856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.162905 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.162867663 +0000 UTC m=+212.524693723 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.162976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163036 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.163050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163064 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 
07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.163154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163259 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163327 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163350 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163369 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163386 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.163358748 +0000 UTC m=+212.525184808 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163253 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163449 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.163420909 +0000 UTC m=+212.525246969 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163500 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.163482051 +0000 UTC m=+212.525308111 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163132 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.163646 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.163582854 +0000 UTC m=+212.525408904 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.263902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.264176 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.264316 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs podName:33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2 nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.26428982 +0000 UTC m=+212.626115880 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs") pod "network-metrics-daemon-h66ds" (UID: "33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.321962 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.321982 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:55 crc kubenswrapper[4749]: I0225 07:19:55.322143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.322316 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.322460 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:55 crc kubenswrapper[4749]: E0225 07:19:55.322981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:56 crc kubenswrapper[4749]: I0225 07:19:56.322113 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:56 crc kubenswrapper[4749]: E0225 07:19:56.322623 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.321834 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:57 crc kubenswrapper[4749]: E0225 07:19:57.321988 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.322306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:57 crc kubenswrapper[4749]: E0225 07:19:57.322535 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.324824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:57 crc kubenswrapper[4749]: E0225 07:19:57.325035 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.339576 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b188c20-3f25-4dd5-a437-2ccf3011bd54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca475d00492be3bfb2b8281b1f1f94ae5b029e8c6fe30bbbc69a326883f23009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74ad0b7b6eb6ef72955cf36a90f39a930c30f40914d4657a22f2cef8da44ad7b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:00Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0225 07:17:29.758751 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0225 07:17:29.760823 1 observer_polling.go:159] Starting file observer\\\\nI0225 07:17:29.800886 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0225 07:17:29.804973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0225 07:18:00.214346 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f8ae6d6a181a5d061036f26b08f9cfd4870dad00ce2467bc24132859bc98c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://503c9440ce2959d7caac7c133ee796bbba4aa561da7f8685eaf574cdb5e998e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.359981 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a58a6b62cbeaefa34857043f108b0992bbc654953500f9e9df227ab52457ec52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8209fb6e9c397b573ca8b666b51b69ffcac421ddf3cfeef533141d2669160a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.376237 4749 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-image-registry/node-ca-qkp9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e94f971-4065-492a-822d-39734b6edf77\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://320c2eccf47d72a6eb78b62838748106a4d6718921155595fb1f0bc134cc42d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hxtjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qkp9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.396514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6de5aab1-70e2-4c55-8c06-59cc173abb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0225 07:18:20.008365 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0225 07:18:20.009082 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0225 07:18:20.009817 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3277287311/tls.crt::/tmp/serving-cert-3277287311/tls.key\\\\\\\"\\\\nI0225 07:18:20.358450 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0225 07:18:20.366920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0225 07:18:20.366952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0225 07:18:20.366984 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0225 07:18:20.366989 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0225 07:18:20.382838 1 secure_serving.go:57] Forcing 
use of http/1.1 only\\\\nW0225 07:18:20.383084 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383109 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0225 07:18:20.383130 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0225 07:18:20.383150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0225 07:18:20.383169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0225 07:18:20.383189 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0225 07:18:20.383495 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0225 07:18:20.387802 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: E0225 07:19:57.427415 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.428242 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d313fa3-6077-4183-9db2-5f94cc02dcf1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b85b3f91277923b86ccce667bcd3ab8d87ef0a9dc7bbf0897a72b28506d7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0134668a782b6a0cc81936248b1dd865e25bc0203cd221fdd6913658ca3a925b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed02161301ef3f4761439ba113d8a6e9bf561d52bf98c76a0ab853d7b8becb75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://592064d1d5302c305f54e20e7d0137e1c5f27e4cdac1617002cc98ba85d6bb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1097faeb2a2f6cdabc7347fbf4dad01a0c0a94cf63773831be2444b878d7edbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a58533ab932acfec670929503153ffbce86b5bc3d270a0b4862c32b53d68772\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42f0559f20f42c032de62c82af4a2512ff548aac933929e2586781e630a4c245\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc98be57f9b7c85c068456110efa983f9c984d203c003f66920423ca0fb6d9c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.448488 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b471620322be31fd2c74d632e35146da52a66a7b557a830e160963b79318690b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.466628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a665cc4-2925-4a4d-bd03-3a05d3dee6da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8b958c8be48cb9d84a64119ea908f93f490bc48ac52b7184a22c1d99e4e76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a4c82a22c54fe0ea8ad869d2cc2809820b0c9da27d4afdc3e66d132bbc916c2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f7c031925d79daa61ce360ffef2f9be47ab627a11bd0fd2d3fbb7b1aad34a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5635b7f7e9d2e5586efd2b9970b29d38128a922aa3847b586d1ed6c2e1838e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bfccc6d8ca4b346d2845a7ce68f9b6b9a883652c1502b4159b69490a6a2206c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd20f2a137c44a939576515fbcb35648bb55a33808c0684ffd35131d6094b77f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7505545dd700a4f1fd2b1e20752ca9b15ef7b1e9195182ff267510629f199133\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hw4qz\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmpqc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.487567 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.508221 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.526755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1183771e-2d52-421f-8c26-0aaff531934a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595e1a246abf157c37f98d2be2bdcfe171ae3bee5e130548e48b8eff9a980a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b
9dfdf77669d15bc63730db71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dvtjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ljd89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.538744 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h66ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4qfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h66ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc 
kubenswrapper[4749]: I0225 07:19:57.550623 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bkmjf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21c23d4e-91a8-4374-84dc-7bdc7450661d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:39Z\\\",\\\"message\\\":\\\"2026-02-25T07:18:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611\\\\n2026-02-25T07:18:54+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_28e4aa30-a65a-46d5-9215-546c39620611 to /host/opt/cni/bin/\\\\n2026-02-25T07:18:54Z [verbose] multus-daemon started\\\\n2026-02-25T07:18:54Z [verbose] Readiness Indicator file check\\\\n2026-02-25T07:19:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rltj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bkmjf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.562801 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4702e42e-e7d8-4126-b968-dabf40d0798f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c009be955f5fb7911cfb04ea731879ce1688397f200f18a9266a787d29dc52bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ddd91683340621e12c69957fe01483bbb0d1
92723c4fb847529d0dc530258a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2lvt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-mw54s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.578072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-89w9z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b170d49d-ba02-4991-8a52-79b7114d6a67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://863cbf04158f86a00f12fc166396b8c4cc629b580efb0ea3af328a963703deb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8x9w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-89w9z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.592448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5da6fc0-e312-42e7-8ec7-0fd72882a947\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:17:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44c0506218fcdd72bfc8b37561af25ad6a1439c7d59f0a01c4e4e20b6bd8f842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c677ed567931799faec665bd1b2bef6dcb10a5a27a7a1c869651e8a01b2ad655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c72861667f8f3cdae664ee65aae42f5f920b8b3d0e48657b21c24fbf47c6a7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:17:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d47da83ca24f64baca878d5ae5878a54d1ad8b3bd7c47aab9a33d6068aed0f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:17:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:17:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:17:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.607423 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.622103 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a9bb297bae3d5b45bb8887a022eaeb56b441199d9ba466f2e436f179085399c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:57 crc kubenswrapper[4749]: I0225 07:19:57.655132 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fae19e32-92e3-446f-9a38-85e8fef239dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-25T07:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-25T07:19:51Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0225 07:19:51.629378 7281 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.629614 7281 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0225 07:19:51.629616 7281 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0225 07:19:51.630038 7281 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0225 07:19:51.630089 7281 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0225 07:19:51.630095 7281 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0225 07:19:51.630116 7281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0225 07:19:51.630137 7281 factory.go:656] Stopping watch factory\\\\nI0225 07:19:51.630147 7281 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0225 07:19:51.630153 7281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0225 07:19:51.630161 7281 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0225 07:19:51.630164 7281 handler.go:208] Removed *v1.Pod ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-25T07:19:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-25T07:18:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://220fa7132db7c3fdf8
e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-25T07:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-25T07:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klvlz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-25T07:18:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-r9pzm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:57Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:58 crc kubenswrapper[4749]: I0225 07:19:58.322267 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:19:58 crc kubenswrapper[4749]: E0225 07:19:58.322764 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.321585 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.321695 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.321692 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.321839 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.322052 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.322238 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.440131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.440185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.440201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.440224 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.440241 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:59Z","lastTransitionTime":"2026-02-25T07:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.463441 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.469356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.469465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.469486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.469512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.469530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:59Z","lastTransitionTime":"2026-02-25T07:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.492462 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.499380 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.499545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.499693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.499813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.499923 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:59Z","lastTransitionTime":"2026-02-25T07:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.520310 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.525722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.525903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.526011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.526121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.526230 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:59Z","lastTransitionTime":"2026-02-25T07:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.568464 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.574021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.574159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.574227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.574306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:19:59 crc kubenswrapper[4749]: I0225 07:19:59.574540 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:19:59Z","lastTransitionTime":"2026-02-25T07:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.591538 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-25T07:19:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1fea3a71-dd7f-464c-9c92-1cf7272a2aba\\\",\\\"systemUUID\\\":\\\"5bb6cb94-61d9-4bd6-9698-2d2f87101f9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-25T07:19:59Z is after 2025-08-24T17:21:41Z" Feb 25 07:19:59 crc kubenswrapper[4749]: E0225 07:19:59.591974 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 25 07:20:00 crc kubenswrapper[4749]: I0225 07:20:00.321976 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:00 crc kubenswrapper[4749]: E0225 07:20:00.322209 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:01 crc kubenswrapper[4749]: I0225 07:20:01.321679 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:01 crc kubenswrapper[4749]: I0225 07:20:01.321936 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:01 crc kubenswrapper[4749]: I0225 07:20:01.322022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:01 crc kubenswrapper[4749]: E0225 07:20:01.322213 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:01 crc kubenswrapper[4749]: E0225 07:20:01.322714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:01 crc kubenswrapper[4749]: E0225 07:20:01.322825 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:02 crc kubenswrapper[4749]: I0225 07:20:02.321983 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:02 crc kubenswrapper[4749]: E0225 07:20:02.322643 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:02 crc kubenswrapper[4749]: I0225 07:20:02.337161 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 25 07:20:02 crc kubenswrapper[4749]: E0225 07:20:02.429210 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:20:03 crc kubenswrapper[4749]: I0225 07:20:03.321849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:03 crc kubenswrapper[4749]: E0225 07:20:03.322248 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:03 crc kubenswrapper[4749]: I0225 07:20:03.321980 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:03 crc kubenswrapper[4749]: E0225 07:20:03.322508 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:03 crc kubenswrapper[4749]: I0225 07:20:03.321927 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:03 crc kubenswrapper[4749]: E0225 07:20:03.322915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:04 crc kubenswrapper[4749]: I0225 07:20:04.322128 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:04 crc kubenswrapper[4749]: E0225 07:20:04.322970 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:05 crc kubenswrapper[4749]: I0225 07:20:05.322201 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:05 crc kubenswrapper[4749]: E0225 07:20:05.322418 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:05 crc kubenswrapper[4749]: I0225 07:20:05.322558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:05 crc kubenswrapper[4749]: I0225 07:20:05.322850 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:05 crc kubenswrapper[4749]: E0225 07:20:05.323312 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:05 crc kubenswrapper[4749]: E0225 07:20:05.323435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:06 crc kubenswrapper[4749]: I0225 07:20:06.321816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:06 crc kubenswrapper[4749]: E0225 07:20:06.322175 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:06 crc kubenswrapper[4749]: I0225 07:20:06.323439 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:20:06 crc kubenswrapper[4749]: E0225 07:20:06.323740 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.321655 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.321915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:07 crc kubenswrapper[4749]: E0225 07:20:07.322017 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.322155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:07 crc kubenswrapper[4749]: E0225 07:20:07.322397 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:07 crc kubenswrapper[4749]: E0225 07:20:07.323060 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.357336 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-89w9z" podStartSLOduration=107.357312866 podStartE2EDuration="1m47.357312866s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.357014187 +0000 UTC m=+160.718840247" watchObservedRunningTime="2026-02-25 07:20:07.357312866 +0000 UTC m=+160.719138916" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.406657 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=36.406584553 podStartE2EDuration="36.406584553s" podCreationTimestamp="2026-02-25 07:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.382743962 +0000 UTC m=+160.744570032" watchObservedRunningTime="2026-02-25 07:20:07.406584553 +0000 UTC m=+160.768410583" Feb 25 07:20:07 crc kubenswrapper[4749]: E0225 07:20:07.430414 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.459703 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podStartSLOduration=107.459684048 podStartE2EDuration="1m47.459684048s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.440983101 +0000 UTC m=+160.802809141" watchObservedRunningTime="2026-02-25 07:20:07.459684048 +0000 UTC m=+160.821510078" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.502216 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bkmjf" podStartSLOduration=107.502183144 podStartE2EDuration="1m47.502183144s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.479785264 +0000 UTC m=+160.841611334" watchObservedRunningTime="2026-02-25 07:20:07.502183144 +0000 UTC m=+160.864009204" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.546967 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-mw54s" podStartSLOduration=107.546947635 podStartE2EDuration="1m47.546947635s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.503206434 +0000 UTC m=+160.865032484" watchObservedRunningTime="2026-02-25 07:20:07.546947635 +0000 UTC m=+160.908773665" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.569453 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=24.569425738 
podStartE2EDuration="24.569425738s" podCreationTimestamp="2026-02-25 07:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.568839572 +0000 UTC m=+160.930665632" watchObservedRunningTime="2026-02-25 07:20:07.569425738 +0000 UTC m=+160.931251798" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.612247 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.612217673 podStartE2EDuration="1m13.612217673s" podCreationTimestamp="2026-02-25 07:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.610947537 +0000 UTC m=+160.972773597" watchObservedRunningTime="2026-02-25 07:20:07.612217673 +0000 UTC m=+160.974043743" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.648669 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=58.648651149 podStartE2EDuration="58.648651149s" podCreationTimestamp="2026-02-25 07:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.648114393 +0000 UTC m=+161.009940423" watchObservedRunningTime="2026-02-25 07:20:07.648651149 +0000 UTC m=+161.010477179" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.676982 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qkp9r" podStartSLOduration=107.676958586 podStartE2EDuration="1m47.676958586s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.675832124 +0000 UTC m=+161.037658154" 
watchObservedRunningTime="2026-02-25 07:20:07.676958586 +0000 UTC m=+161.038784646" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.692485 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.692463812 podStartE2EDuration="5.692463812s" podCreationTimestamp="2026-02-25 07:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.690727924 +0000 UTC m=+161.052553984" watchObservedRunningTime="2026-02-25 07:20:07.692463812 +0000 UTC m=+161.054289882" Feb 25 07:20:07 crc kubenswrapper[4749]: I0225 07:20:07.751208 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tmpqc" podStartSLOduration=107.751184716 podStartE2EDuration="1m47.751184716s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:07.750577169 +0000 UTC m=+161.112403199" watchObservedRunningTime="2026-02-25 07:20:07.751184716 +0000 UTC m=+161.113010756" Feb 25 07:20:08 crc kubenswrapper[4749]: I0225 07:20:08.321805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:08 crc kubenswrapper[4749]: E0225 07:20:08.321997 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.321969 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.321998 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.322129 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:09 crc kubenswrapper[4749]: E0225 07:20:09.322334 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:09 crc kubenswrapper[4749]: E0225 07:20:09.322466 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:09 crc kubenswrapper[4749]: E0225 07:20:09.322701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.917886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.917978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.918004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.918031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.918050 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-25T07:20:09Z","lastTransitionTime":"2026-02-25T07:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.973445 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm"] Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.974064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.977571 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.977778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.977905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 07:20:09 crc kubenswrapper[4749]: I0225 07:20:09.979148 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.016520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.016923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.016962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e2f74124-67bd-46b6-9b06-0cea473c8760-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.017114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2f74124-67bd-46b6-9b06-0cea473c8760-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.017293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2f74124-67bd-46b6-9b06-0cea473c8760-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.118859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2f74124-67bd-46b6-9b06-0cea473c8760-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.118984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2f74124-67bd-46b6-9b06-0cea473c8760-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 
07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.119052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.119086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.119122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2f74124-67bd-46b6-9b06-0cea473c8760-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.119254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.119259 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e2f74124-67bd-46b6-9b06-0cea473c8760-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.120163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2f74124-67bd-46b6-9b06-0cea473c8760-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.129439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2f74124-67bd-46b6-9b06-0cea473c8760-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.148103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2f74124-67bd-46b6-9b06-0cea473c8760-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lswpm\" (UID: \"e2f74124-67bd-46b6-9b06-0cea473c8760\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.296590 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.321316 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:10 crc kubenswrapper[4749]: E0225 07:20:10.321504 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.343953 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.354452 4749 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 25 07:20:10 crc kubenswrapper[4749]: I0225 07:20:10.425974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" event={"ID":"e2f74124-67bd-46b6-9b06-0cea473c8760","Type":"ContainerStarted","Data":"4e131353d5c462b206ec2d5e5478954c9dacd0d504dd40f674f5cbb847fcebf1"} Feb 25 07:20:11 crc kubenswrapper[4749]: I0225 07:20:11.321430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:11 crc kubenswrapper[4749]: I0225 07:20:11.321471 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:11 crc kubenswrapper[4749]: E0225 07:20:11.322117 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:11 crc kubenswrapper[4749]: E0225 07:20:11.321931 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:11 crc kubenswrapper[4749]: I0225 07:20:11.321724 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:11 crc kubenswrapper[4749]: E0225 07:20:11.322259 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:11 crc kubenswrapper[4749]: I0225 07:20:11.431977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" event={"ID":"e2f74124-67bd-46b6-9b06-0cea473c8760","Type":"ContainerStarted","Data":"9ced6b8cea2330b2fae6d5ab0d3c452452180e51a05d80bd12bc75fe6e820e12"} Feb 25 07:20:12 crc kubenswrapper[4749]: I0225 07:20:12.321753 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:12 crc kubenswrapper[4749]: E0225 07:20:12.322280 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:12 crc kubenswrapper[4749]: E0225 07:20:12.432001 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:20:13 crc kubenswrapper[4749]: I0225 07:20:13.321720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:13 crc kubenswrapper[4749]: I0225 07:20:13.321836 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:13 crc kubenswrapper[4749]: I0225 07:20:13.322149 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:13 crc kubenswrapper[4749]: E0225 07:20:13.322143 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:13 crc kubenswrapper[4749]: E0225 07:20:13.322290 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:13 crc kubenswrapper[4749]: E0225 07:20:13.322435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:14 crc kubenswrapper[4749]: I0225 07:20:14.321926 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:14 crc kubenswrapper[4749]: E0225 07:20:14.322125 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:15 crc kubenswrapper[4749]: I0225 07:20:15.321937 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:15 crc kubenswrapper[4749]: I0225 07:20:15.321965 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:15 crc kubenswrapper[4749]: I0225 07:20:15.322028 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:15 crc kubenswrapper[4749]: E0225 07:20:15.322565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:15 crc kubenswrapper[4749]: E0225 07:20:15.322688 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:15 crc kubenswrapper[4749]: E0225 07:20:15.323679 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:16 crc kubenswrapper[4749]: I0225 07:20:16.321676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:16 crc kubenswrapper[4749]: E0225 07:20:16.322237 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:17 crc kubenswrapper[4749]: I0225 07:20:17.322105 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:17 crc kubenswrapper[4749]: I0225 07:20:17.322125 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:17 crc kubenswrapper[4749]: I0225 07:20:17.322183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:17 crc kubenswrapper[4749]: E0225 07:20:17.324408 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:17 crc kubenswrapper[4749]: E0225 07:20:17.325428 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:17 crc kubenswrapper[4749]: E0225 07:20:17.324576 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:17 crc kubenswrapper[4749]: E0225 07:20:17.433626 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:20:18 crc kubenswrapper[4749]: I0225 07:20:18.321585 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:18 crc kubenswrapper[4749]: E0225 07:20:18.321812 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:18 crc kubenswrapper[4749]: I0225 07:20:18.323027 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:20:18 crc kubenswrapper[4749]: E0225 07:20:18.323397 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-r9pzm_openshift-ovn-kubernetes(fae19e32-92e3-446f-9a38-85e8fef239dd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" Feb 25 07:20:19 crc kubenswrapper[4749]: I0225 07:20:19.321730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:19 crc kubenswrapper[4749]: I0225 07:20:19.321807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:19 crc kubenswrapper[4749]: I0225 07:20:19.321730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:19 crc kubenswrapper[4749]: E0225 07:20:19.321977 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:19 crc kubenswrapper[4749]: E0225 07:20:19.322090 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:19 crc kubenswrapper[4749]: E0225 07:20:19.322245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:20 crc kubenswrapper[4749]: I0225 07:20:20.321736 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:20 crc kubenswrapper[4749]: E0225 07:20:20.322202 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:21 crc kubenswrapper[4749]: I0225 07:20:21.322162 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:21 crc kubenswrapper[4749]: I0225 07:20:21.322263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:21 crc kubenswrapper[4749]: I0225 07:20:21.322210 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:21 crc kubenswrapper[4749]: E0225 07:20:21.322438 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:21 crc kubenswrapper[4749]: E0225 07:20:21.322672 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2" Feb 25 07:20:21 crc kubenswrapper[4749]: E0225 07:20:21.322832 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 25 07:20:22 crc kubenswrapper[4749]: I0225 07:20:22.322432 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:22 crc kubenswrapper[4749]: E0225 07:20:22.322891 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 25 07:20:22 crc kubenswrapper[4749]: E0225 07:20:22.434476 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:20:23 crc kubenswrapper[4749]: I0225 07:20:23.324963 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:23 crc kubenswrapper[4749]: E0225 07:20:23.325083 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 25 07:20:23 crc kubenswrapper[4749]: I0225 07:20:23.325273 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:23 crc kubenswrapper[4749]: E0225 07:20:23.325332 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:23 crc kubenswrapper[4749]: I0225 07:20:23.325456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:23 crc kubenswrapper[4749]: E0225 07:20:23.325511 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:24 crc kubenswrapper[4749]: I0225 07:20:24.321728 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:24 crc kubenswrapper[4749]: E0225 07:20:24.321983 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:25 crc kubenswrapper[4749]: I0225 07:20:25.321881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:25 crc kubenswrapper[4749]: E0225 07:20:25.322031 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:25 crc kubenswrapper[4749]: I0225 07:20:25.321881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:25 crc kubenswrapper[4749]: I0225 07:20:25.322102 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:25 crc kubenswrapper[4749]: E0225 07:20:25.322169 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:25 crc kubenswrapper[4749]: E0225 07:20:25.322227 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.322389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:26 crc kubenswrapper[4749]: E0225 07:20:26.322767 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.484829 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/1.log"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.485654 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/0.log"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.485737 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c23d4e-91a8-4374-84dc-7bdc7450661d" containerID="f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5" exitCode=1
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.485795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerDied","Data":"f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5"}
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.485882 4749 scope.go:117] "RemoveContainer" containerID="c433997a8ed184098215e0f88ceb5a9627328c74578d6e8335c6d6f7a1727bc6"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.487821 4749 scope.go:117] "RemoveContainer" containerID="f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5"
Feb 25 07:20:26 crc kubenswrapper[4749]: E0225 07:20:26.488110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bkmjf_openshift-multus(21c23d4e-91a8-4374-84dc-7bdc7450661d)\"" pod="openshift-multus/multus-bkmjf" podUID="21c23d4e-91a8-4374-84dc-7bdc7450661d"
Feb 25 07:20:26 crc kubenswrapper[4749]: I0225 07:20:26.523172 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lswpm" podStartSLOduration=126.523144843 podStartE2EDuration="2m6.523144843s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:11.452370199 +0000 UTC m=+164.814196279" watchObservedRunningTime="2026-02-25 07:20:26.523144843 +0000 UTC m=+179.884970893"
Feb 25 07:20:27 crc kubenswrapper[4749]: I0225 07:20:27.322087 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:27 crc kubenswrapper[4749]: I0225 07:20:27.322140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:27 crc kubenswrapper[4749]: I0225 07:20:27.322323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:27 crc kubenswrapper[4749]: E0225 07:20:27.323929 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:27 crc kubenswrapper[4749]: E0225 07:20:27.324071 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:27 crc kubenswrapper[4749]: E0225 07:20:27.324225 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:27 crc kubenswrapper[4749]: E0225 07:20:27.435969 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 25 07:20:27 crc kubenswrapper[4749]: I0225 07:20:27.491307 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/1.log"
Feb 25 07:20:28 crc kubenswrapper[4749]: I0225 07:20:28.321995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:28 crc kubenswrapper[4749]: E0225 07:20:28.322188 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:29 crc kubenswrapper[4749]: I0225 07:20:29.321730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:29 crc kubenswrapper[4749]: E0225 07:20:29.321895 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:29 crc kubenswrapper[4749]: I0225 07:20:29.321730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:29 crc kubenswrapper[4749]: I0225 07:20:29.321972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:29 crc kubenswrapper[4749]: E0225 07:20:29.322078 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:29 crc kubenswrapper[4749]: E0225 07:20:29.322214 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:30 crc kubenswrapper[4749]: I0225 07:20:30.321613 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:30 crc kubenswrapper[4749]: E0225 07:20:30.321759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:31 crc kubenswrapper[4749]: I0225 07:20:31.321695 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:31 crc kubenswrapper[4749]: I0225 07:20:31.321838 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:31 crc kubenswrapper[4749]: E0225 07:20:31.322005 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:31 crc kubenswrapper[4749]: I0225 07:20:31.322071 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:31 crc kubenswrapper[4749]: E0225 07:20:31.322184 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:31 crc kubenswrapper[4749]: E0225 07:20:31.322309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:32 crc kubenswrapper[4749]: I0225 07:20:32.322151 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:32 crc kubenswrapper[4749]: E0225 07:20:32.322270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:32 crc kubenswrapper[4749]: E0225 07:20:32.437246 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 25 07:20:33 crc kubenswrapper[4749]: I0225 07:20:33.322254 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:33 crc kubenswrapper[4749]: I0225 07:20:33.322355 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:33 crc kubenswrapper[4749]: I0225 07:20:33.322484 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:33 crc kubenswrapper[4749]: E0225 07:20:33.322479 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:33 crc kubenswrapper[4749]: E0225 07:20:33.323264 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:33 crc kubenswrapper[4749]: E0225 07:20:33.323384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:33 crc kubenswrapper[4749]: I0225 07:20:33.323904 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"
Feb 25 07:20:34 crc kubenswrapper[4749]: I0225 07:20:34.321785 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:34 crc kubenswrapper[4749]: E0225 07:20:34.322324 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:34 crc kubenswrapper[4749]: I0225 07:20:34.516657 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/3.log"
Feb 25 07:20:34 crc kubenswrapper[4749]: I0225 07:20:34.520853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerStarted","Data":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"}
Feb 25 07:20:34 crc kubenswrapper[4749]: I0225 07:20:34.522572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm"
Feb 25 07:20:35 crc kubenswrapper[4749]: I0225 07:20:35.302698 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podStartSLOduration=135.302663915 podStartE2EDuration="2m15.302663915s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:34.576281032 +0000 UTC m=+187.938107112" watchObservedRunningTime="2026-02-25 07:20:35.302663915 +0000 UTC m=+188.664489975"
Feb 25 07:20:35 crc kubenswrapper[4749]: I0225 07:20:35.303449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h66ds"]
Feb 25 07:20:35 crc kubenswrapper[4749]: I0225 07:20:35.303574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:35 crc kubenswrapper[4749]: E0225 07:20:35.303770 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:35 crc kubenswrapper[4749]: I0225 07:20:35.322198 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:35 crc kubenswrapper[4749]: I0225 07:20:35.322258 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:35 crc kubenswrapper[4749]: E0225 07:20:35.322418 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:35 crc kubenswrapper[4749]: E0225 07:20:35.322635 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:36 crc kubenswrapper[4749]: I0225 07:20:36.321628 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:36 crc kubenswrapper[4749]: E0225 07:20:36.321852 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:37 crc kubenswrapper[4749]: I0225 07:20:37.321383 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:37 crc kubenswrapper[4749]: I0225 07:20:37.321449 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:37 crc kubenswrapper[4749]: I0225 07:20:37.321494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:37 crc kubenswrapper[4749]: E0225 07:20:37.323410 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:37 crc kubenswrapper[4749]: E0225 07:20:37.323573 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:37 crc kubenswrapper[4749]: E0225 07:20:37.323701 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:37 crc kubenswrapper[4749]: E0225 07:20:37.438508 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 25 07:20:38 crc kubenswrapper[4749]: I0225 07:20:38.321630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:38 crc kubenswrapper[4749]: E0225 07:20:38.322044 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:38 crc kubenswrapper[4749]: I0225 07:20:38.322138 4749 scope.go:117] "RemoveContainer" containerID="f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5"
Feb 25 07:20:38 crc kubenswrapper[4749]: I0225 07:20:38.548584 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/1.log"
Feb 25 07:20:38 crc kubenswrapper[4749]: I0225 07:20:38.548705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerStarted","Data":"87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88"}
Feb 25 07:20:39 crc kubenswrapper[4749]: I0225 07:20:39.322349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:39 crc kubenswrapper[4749]: I0225 07:20:39.322509 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:39 crc kubenswrapper[4749]: E0225 07:20:39.322748 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:39 crc kubenswrapper[4749]: I0225 07:20:39.322793 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:39 crc kubenswrapper[4749]: E0225 07:20:39.322981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:39 crc kubenswrapper[4749]: E0225 07:20:39.323133 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:40 crc kubenswrapper[4749]: I0225 07:20:40.321687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:40 crc kubenswrapper[4749]: E0225 07:20:40.321874 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:41 crc kubenswrapper[4749]: I0225 07:20:41.321893 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:41 crc kubenswrapper[4749]: I0225 07:20:41.321919 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:41 crc kubenswrapper[4749]: E0225 07:20:41.322173 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 25 07:20:41 crc kubenswrapper[4749]: E0225 07:20:41.322334 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h66ds" podUID="33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2"
Feb 25 07:20:41 crc kubenswrapper[4749]: I0225 07:20:41.322749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:41 crc kubenswrapper[4749]: E0225 07:20:41.322915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 25 07:20:42 crc kubenswrapper[4749]: I0225 07:20:42.321688 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:42 crc kubenswrapper[4749]: E0225 07:20:42.321881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.322418 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.322460 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.322556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.325300 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.325324 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.325376 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.325411 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.325385 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 25 07:20:43 crc kubenswrapper[4749]: I0225 07:20:43.327676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 25 07:20:44 crc kubenswrapper[4749]: I0225 07:20:44.440206 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.800895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.852873 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.853678 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.856995 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mqgw7"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.858242 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.858529 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.858966 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.859382 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.859493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.862698 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.863411 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.863660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.865205 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.871011 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7fnnb"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.877186 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 25 07:20:50 crc kubenswrapper[4749]: W0225 07:20:50.877520 4749 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 25 07:20:50 crc kubenswrapper[4749]: E0225 07:20:50.877821 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.878712 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.879911 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.881654 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.882108 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.882193 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf"]
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.882570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.882873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.883205 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.883441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.883671 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.883863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.884815 4749 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.885664 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.885891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886058 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.885899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886323 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886458 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886881 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886968 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.886878 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.890002 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.890494 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.890851 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.893120 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.894276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.896207 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2scs"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.896780 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.911705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pzj5d"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.912759 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.916277 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.916786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.916859 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.921429 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.930583 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.931735 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.931912 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932228 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932298 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932429 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932055 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.932231 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.936639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.938219 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.939444 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.939759 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.944286 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.944843 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.945645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946309 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946474 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946678 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.946971 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.947009 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 
07:20:50.947134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.947169 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.947269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.949038 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.957426 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.959064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.959304 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.959402 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.959882 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.960576 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.960770 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.960985 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.961021 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.961193 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.961304 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.962054 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.962271 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 07:20:50 crc 
kubenswrapper[4749]: I0225 07:20:50.962407 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963364 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963449 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963619 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963742 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963760 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963886 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.963919 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.964011 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.964052 4749 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.964169 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.969684 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.969860 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.970001 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.973367 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.975854 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.977579 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.978147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.978450 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jwdk8"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.978557 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.978568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.978891 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.980181 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.980570 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg"] Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.994224 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.995886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.995944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswg6\" (UniqueName: \"kubernetes.io/projected/1209de54-9bcd-424b-836c-77a8b90e494f-kube-api-access-fswg6\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:50 crc kubenswrapper[4749]: 
I0225 07:20:50.995981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-audit-dir\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phjz\" (UniqueName: \"kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e416b4-b82c-46d1-a612-9736b8e6db14-serving-cert\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnt5\" (UniqueName: 
\"kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/723f3a22-e0cb-4b03-952d-7f4e6aece976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-trusted-ca\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " 
pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996304 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-config\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996318 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-client\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 
07:20:50.996397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj992\" (UniqueName: \"kubernetes.io/projected/837b8ba9-ad4e-4186-be8a-60807351bf87-kube-api-access-dj992\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-auth-proxy-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-client\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-serving-cert\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-encryption-config\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 
07:20:50.996754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5jkz\" (UniqueName: \"kubernetes.io/projected/a0e416b4-b82c-46d1-a612-9736b8e6db14-kube-api-access-j5jkz\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.996767 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5l2j\" (UniqueName: \"kubernetes.io/projected/723f3a22-e0cb-4b03-952d-7f4e6aece976-kube-api-access-c5l2j\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-serving-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997181 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-encryption-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-serving-cert\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1209de54-9bcd-424b-836c-77a8b90e494f-audit-dir\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfp7z\" (UniqueName: \"kubernetes.io/projected/34230523-16b1-4cba-8dfe-bfd3f0127944-kube-api-access-rfp7z\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: 
\"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997652 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-image-import-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:50 crc kubenswrapper[4749]: I0225 07:20:50.997700 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34230523-16b1-4cba-8dfe-bfd3f0127944-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.000927 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.009853 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.001656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cztt\" (UniqueName: \"kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-audit\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " 
pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-service-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010136 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-config\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-images\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gdls\" (UniqueName: \"kubernetes.io/projected/5eee50b2-fb76-4a78-a3f9-b364421a4178-kube-api-access-8gdls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f31545-040d-4d7a-8466-e0eaca760b03-serving-cert\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 
07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-config\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-node-pullsecrets\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca\") pod 
\"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48b2f\" (UniqueName: \"kubernetes.io/projected/e7f31545-040d-4d7a-8466-e0eaca760b03-kube-api-access-48b2f\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-audit-policies\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34230523-16b1-4cba-8dfe-bfd3f0127944-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010545 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.010616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5eee50b2-fb76-4a78-a3f9-b364421a4178-machine-approver-tls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.011033 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013048 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013170 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 25 
07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013551 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.013694 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfrvr"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.014452 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.014527 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.014967 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.015376 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.015572 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.018661 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.019303 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lhs4d"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.019722 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zlfns"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.020148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.020362 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.020467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.020518 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.028330 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.028469 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.028582 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.030952 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031251 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031588 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031978 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7fnnb"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031988 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031997 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-mqgw7"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.031996 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032069 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032229 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032379 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032410 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032545 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032006 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.032869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.033085 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8pv62"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.033404 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.033456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.033520 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.033745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.034741 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.036527 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.037190 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.040832 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.043102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.043854 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044229 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jscrf"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044567 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533400-6w2xx"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044951 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wxblp"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044990 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.044959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.045137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.045224 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.045831 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.046177 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.046546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.049700 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.050355 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.050464 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.051234 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.058524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.068335 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.071417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.071500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pzj5d"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.071583 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.073496 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.074959 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.081792 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.082063 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.088503 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qwfdr"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.093436 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.094858 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.095772 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.096664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfrvr"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.097898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.101988 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.102168 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jwdk8"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.102758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.104033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2scs"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.105035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lhs4d"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/723f3a22-e0cb-4b03-952d-7f4e6aece976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-trusted-ca\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111489 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-config\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111519 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-client\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5228e90-419a-4be6-a930-5391b35bee6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: 
\"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qrd\" (UniqueName: \"kubernetes.io/projected/30e28bec-ff89-41e7-93c9-da957325acb0-kube-api-access-48qrd\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-auth-proxy-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111658 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-client\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.111675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj992\" (UniqueName: \"kubernetes.io/projected/837b8ba9-ad4e-4186-be8a-60807351bf87-kube-api-access-dj992\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-serving-cert\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-encryption-config\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5jkz\" (UniqueName: \"kubernetes.io/projected/a0e416b4-b82c-46d1-a612-9736b8e6db14-kube-api-access-j5jkz\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 
07:20:51.112402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5l2j\" (UniqueName: \"kubernetes.io/projected/723f3a22-e0cb-4b03-952d-7f4e6aece976-kube-api-access-c5l2j\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-serving-cert\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1209de54-9bcd-424b-836c-77a8b90e494f-audit-dir\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-serving-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " 
pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-encryption-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfp7z\" (UniqueName: \"kubernetes.io/projected/34230523-16b1-4cba-8dfe-bfd3f0127944-kube-api-access-rfp7z\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-image-import-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cztt\" (UniqueName: \"kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34230523-16b1-4cba-8dfe-bfd3f0127944-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc 
kubenswrapper[4749]: I0225 07:20:51.112672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s995t\" (UniqueName: \"kubernetes.io/projected/197fe25d-5e80-4fd6-84a2-3596e19e3703-kube-api-access-s995t\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-service-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: 
\"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-audit\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5228e90-419a-4be6-a930-5391b35bee6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-config\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gdls\" (UniqueName: \"kubernetes.io/projected/5eee50b2-fb76-4a78-a3f9-b364421a4178-kube-api-access-8gdls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-images\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f31545-040d-4d7a-8466-e0eaca760b03-serving-cert\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112980 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-config\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-node-pullsecrets\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 
07:20:51.113063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e28bec-ff89-41e7-93c9-da957325acb0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-audit-policies\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34230523-16b1-4cba-8dfe-bfd3f0127944-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48b2f\" (UniqueName: \"kubernetes.io/projected/e7f31545-040d-4d7a-8466-e0eaca760b03-kube-api-access-48b2f\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: 
\"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5eee50b2-fb76-4a78-a3f9-b364421a4178-machine-approver-tls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswg6\" (UniqueName: 
\"kubernetes.io/projected/1209de54-9bcd-424b-836c-77a8b90e494f-kube-api-access-fswg6\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-audit-dir\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-trusted-ca\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phjz\" (UniqueName: \"kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113633 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e416b4-b82c-46d1-a612-9736b8e6db14-serving-cert\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5228e90-419a-4be6-a930-5391b35bee6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnt5\" (UniqueName: \"kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: 
\"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197fe25d-5e80-4fd6-84a2-3596e19e3703-metrics-tls\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-auth-proxy-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.113846 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.116755 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9jt7w"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.116775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-images\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34230523-16b1-4cba-8dfe-bfd3f0127944-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-serving-cert\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-client\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.112505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/723f3a22-e0cb-4b03-952d-7f4e6aece976-config\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1209de54-9bcd-424b-836c-77a8b90e494f-audit-dir\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117739 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.117824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.118239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-etcd-serving-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.118354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.118469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.118512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.118555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.119192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-etcd-client\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.119433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-audit\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " 
pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.119534 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.119823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-encryption-config\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.119950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-config\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.120240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.120808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/837b8ba9-ad4e-4186-be8a-60807351bf87-encryption-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.121259 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e416b4-b82c-46d1-a612-9736b8e6db14-service-ca-bundle\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.121421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.121604 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rh464"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.121783 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.122230 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.122327 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.123099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.123168 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-image-import-ca\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.123378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.123535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.123587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.124170 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/723f3a22-e0cb-4b03-952d-7f4e6aece976-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7f31545-040d-4d7a-8466-e0eaca760b03-config\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125226 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-audit-dir\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/837b8ba9-ad4e-4186-be8a-60807351bf87-node-pullsecrets\") pod \"apiserver-76f77b778f-7fnnb\" 
(UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e416b4-b82c-46d1-a612-9736b8e6db14-serving-cert\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.125901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f31545-040d-4d7a-8466-e0eaca760b03-serving-cert\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:51 crc kubenswrapper[4749]: 
I0225 07:20:51.126323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eee50b2-fb76-4a78-a3f9-b364421a4178-config\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1209de54-9bcd-424b-836c-77a8b90e494f-audit-policies\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1209de54-9bcd-424b-836c-77a8b90e494f-serving-cert\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.126939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/837b8ba9-ad4e-4186-be8a-60807351bf87-config\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.127022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.127072 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.128181 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.128364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.128797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34230523-16b1-4cba-8dfe-bfd3f0127944-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.129306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8pv62"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.130122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5eee50b2-fb76-4a78-a3f9-b364421a4178-machine-approver-tls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.130257 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.131273 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb"] Feb 25 
07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.132383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.133366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.134383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qwfdr"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.135414 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.137280 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.138726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.140101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533400-6w2xx"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.141243 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9jt7w"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.142834 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jscrf"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.143224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: 
\"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.144050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wxblp"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.145435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.146180 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8jn8v"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.146818 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.148818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jn8v"] Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.155639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.175558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.196279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s995t\" (UniqueName: \"kubernetes.io/projected/197fe25d-5e80-4fd6-84a2-3596e19e3703-kube-api-access-s995t\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5228e90-419a-4be6-a930-5391b35bee6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e28bec-ff89-41e7-93c9-da957325acb0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5228e90-419a-4be6-a930-5391b35bee6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197fe25d-5e80-4fd6-84a2-3596e19e3703-metrics-tls\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5228e90-419a-4be6-a930-5391b35bee6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.214869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qrd\" (UniqueName: \"kubernetes.io/projected/30e28bec-ff89-41e7-93c9-da957325acb0-kube-api-access-48qrd\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.215318 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.220049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/197fe25d-5e80-4fd6-84a2-3596e19e3703-metrics-tls\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.220261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5228e90-419a-4be6-a930-5391b35bee6b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.225924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d5228e90-419a-4be6-a930-5391b35bee6b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.237986 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.241224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/30e28bec-ff89-41e7-93c9-da957325acb0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.256100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.295501 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.316352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.336514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.356325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.375650 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 25 
07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.395571 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.415694 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.438082 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.456309 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.475773 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.496459 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.516979 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.536529 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.555917 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.577148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.596471 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.617500 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.637163 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.656479 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.676947 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.696836 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.716297 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.736930 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.765550 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.776243 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.796952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 
07:20:51.805495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.817361 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.837702 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.857228 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.876979 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.896891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.916465 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.937086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.956430 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.976386 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 25 07:20:51 crc kubenswrapper[4749]: I0225 07:20:51.997735 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.017318 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.034372 4749 request.go:700] Waited for 1.000529416s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.045652 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.057270 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.076346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.096876 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.116670 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.136206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.156572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 
07:20:52.176566 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.197128 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.216455 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.237579 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.256956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.276573 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.296141 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.317043 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.336924 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.358276 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.377436 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.396715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.417046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.437083 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.457031 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.476647 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.496252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.517693 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.537129 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.557619 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.576185 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.596633 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.617274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.636834 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.656900 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.696638 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.716676 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.737293 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.757061 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.776566 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.798342 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.817119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.864854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jkz\" (UniqueName: \"kubernetes.io/projected/a0e416b4-b82c-46d1-a612-9736b8e6db14-kube-api-access-j5jkz\") pod \"authentication-operator-69f744f599-pzj5d\" (UID: \"a0e416b4-b82c-46d1-a612-9736b8e6db14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.904715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj992\" (UniqueName: \"kubernetes.io/projected/837b8ba9-ad4e-4186-be8a-60807351bf87-kube-api-access-dj992\") pod \"apiserver-76f77b778f-7fnnb\" (UID: \"837b8ba9-ad4e-4186-be8a-60807351bf87\") " pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.912161 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.918010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.925350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cztt\" (UniqueName: \"kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt\") pod \"controller-manager-879f6c89f-x5ctv\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.937101 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.956875 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 25 07:20:52 crc kubenswrapper[4749]: I0225 07:20:52.998103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfp7z\" (UniqueName: \"kubernetes.io/projected/34230523-16b1-4cba-8dfe-bfd3f0127944-kube-api-access-rfp7z\") pod \"openshift-apiserver-operator-796bbdcf4f-qqx6g\" (UID: \"34230523-16b1-4cba-8dfe-bfd3f0127944\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.023396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gdls\" (UniqueName: \"kubernetes.io/projected/5eee50b2-fb76-4a78-a3f9-b364421a4178-kube-api-access-8gdls\") pod \"machine-approver-56656f9798-jrlsf\" (UID: \"5eee50b2-fb76-4a78-a3f9-b364421a4178\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.034628 4749 request.go:700] Waited for 1.912062616s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.038940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phjz\" (UniqueName: \"kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz\") pod \"route-controller-manager-6576b87f9c-fnj7g\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.042117 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.056565 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.076869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.096450 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.119402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48b2f\" (UniqueName: \"kubernetes.io/projected/e7f31545-040d-4d7a-8466-e0eaca760b03-kube-api-access-48b2f\") pod \"console-operator-58897d9998-w2scs\" (UID: \"e7f31545-040d-4d7a-8466-e0eaca760b03\") " pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.137306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.140333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fnt5\" (UniqueName: \"kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5\") pod \"oauth-openshift-558db77b4-cdjg8\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.151726 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.157101 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.166712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswg6\" (UniqueName: \"kubernetes.io/projected/1209de54-9bcd-424b-836c-77a8b90e494f-kube-api-access-fswg6\") pod \"apiserver-7bbb656c7d-rl7ff\" (UID: \"1209de54-9bcd-424b-836c-77a8b90e494f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.170473 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.175841 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.183114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pzj5d"] Feb 25 07:20:53 crc kubenswrapper[4749]: W0225 07:20:53.186344 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eee50b2_fb76_4a78_a3f9_b364421a4178.slice/crio-af86a6852733fb5735ef88d9faacac0580aa1b2f5cc7ae6d127f60601df5180c WatchSource:0}: Error finding container af86a6852733fb5735ef88d9faacac0580aa1b2f5cc7ae6d127f60601df5180c: Status 404 returned error can't find the container with id af86a6852733fb5735ef88d9faacac0580aa1b2f5cc7ae6d127f60601df5180c Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.194035 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.195873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 25 07:20:53 crc kubenswrapper[4749]: W0225 07:20:53.198651 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e416b4_b82c_46d1_a612_9736b8e6db14.slice/crio-5f480f967cbaaf24ed5cb9f8f9bba9b896ac11aa55e6e64439cfef7dbf21af56 WatchSource:0}: Error finding container 5f480f967cbaaf24ed5cb9f8f9bba9b896ac11aa55e6e64439cfef7dbf21af56: Status 404 returned error can't find the container with id 5f480f967cbaaf24ed5cb9f8f9bba9b896ac11aa55e6e64439cfef7dbf21af56 Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.202937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.239725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s995t\" (UniqueName: \"kubernetes.io/projected/197fe25d-5e80-4fd6-84a2-3596e19e3703-kube-api-access-s995t\") pod \"dns-operator-744455d44c-rfrvr\" (UID: \"197fe25d-5e80-4fd6-84a2-3596e19e3703\") " pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.263977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qrd\" (UniqueName: \"kubernetes.io/projected/30e28bec-ff89-41e7-93c9-da957325acb0-kube-api-access-48qrd\") pod \"cluster-samples-operator-665b6dd947-tm8jc\" (UID: \"30e28bec-ff89-41e7-93c9-da957325acb0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.269825 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.282258 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5228e90-419a-4be6-a930-5391b35bee6b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wqfqj\" (UID: \"d5228e90-419a-4be6-a930-5391b35bee6b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.296011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.307570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5l2j\" (UniqueName: \"kubernetes.io/projected/723f3a22-e0cb-4b03-952d-7f4e6aece976-kube-api-access-c5l2j\") pod \"machine-api-operator-5694c8668f-mqgw7\" (UID: \"723f3a22-e0cb-4b03-952d-7f4e6aece976\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.307729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.316143 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.326816 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a062daf8-94d8-4c83-a020-ab3a8c86b752-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8fr2\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-kube-api-access-b8fr2\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348873 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a12af2-1536-48da-9867-3eb65b8c82cb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.348943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ee8334-f4bb-440d-831d-52339abe2875-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7lpx\" (UniqueName: \"kubernetes.io/projected/a062daf8-94d8-4c83-a020-ab3a8c86b752-kube-api-access-j7lpx\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349155 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcgh6\" (UniqueName: \"kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a12af2-1536-48da-9867-3eb65b8c82cb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntc9\" (UniqueName: \"kubernetes.io/projected/26ee8334-f4bb-440d-831d-52339abe2875-kube-api-access-2ntc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24p9k\" (UniqueName: \"kubernetes.io/projected/b699df8e-e341-4be3-9c4c-04e3b13d2737-kube-api-access-24p9k\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qnl\" (UniqueName: \"kubernetes.io/projected/3f34750e-27ea-4d28-b6a9-0f65d2b29e92-kube-api-access-48qnl\") pod \"downloads-7954f5f757-jwdk8\" (UID: \"3f34750e-27ea-4d28-b6a9-0f65d2b29e92\") " pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349347 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a062daf8-94d8-4c83-a020-ab3a8c86b752-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b699df8e-e341-4be3-9c4c-04e3b13d2737-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b699df8e-e341-4be3-9c4c-04e3b13d2737-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349643 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5vmh\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ee8334-f4bb-440d-831d-52339abe2875-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.349674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.351847 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:53.851826106 +0000 UTC m=+207.213652126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.355306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.362156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.369346 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7fnnb"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.407992 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:20:53 crc kubenswrapper[4749]: W0225 07:20:53.428257 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod837b8ba9_ad4e_4186_be8a_60807351bf87.slice/crio-2a9ea25782240a49504820a6eb0cc6f479528b718fb5c82f3c8fde8553483644 WatchSource:0}: Error finding container 2a9ea25782240a49504820a6eb0cc6f479528b718fb5c82f3c8fde8553483644: Status 404 returned error can't find the container with id 2a9ea25782240a49504820a6eb0cc6f479528b718fb5c82f3c8fde8553483644 Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.450183 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:53.950157618 +0000 UTC m=+207.311983638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450229 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/d7a030e0-e24c-4c03-a1f0-0840c7eec829-kube-api-access-qndcs\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8fr2\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-kube-api-access-b8fr2\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxz7\" 
(UniqueName: \"kubernetes.io/projected/9c785750-976e-4ed9-a9dc-e8df35faeb94-kube-api-access-fdxz7\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-node-bootstrap-token\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-tmpfs\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfw4d\" (UniqueName: \"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-kube-api-access-nfw4d\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjxjc\" (UniqueName: \"kubernetes.io/projected/94a0babc-26c5-4544-8279-449668534ba2-kube-api-access-hjxjc\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nfl\" (UniqueName: \"kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cq2h\" (UniqueName: \"kubernetes.io/projected/293d7797-862d-4cbf-ac7f-16ef038ac6aa-kube-api-access-7cq2h\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdd9c623-4f99-434b-9142-374f3798a39e-config\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ee8334-f4bb-440d-831d-52339abe2875-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450481 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-images\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcgh6\" (UniqueName: \"kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.450522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-cabundle\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxr26\" (UniqueName: \"kubernetes.io/projected/85efff27-fc96-4191-9733-a6a2434a723c-kube-api-access-hxr26\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24p9k\" (UniqueName: \"kubernetes.io/projected/b699df8e-e341-4be3-9c4c-04e3b13d2737-kube-api-access-24p9k\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: 
\"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qnl\" (UniqueName: \"kubernetes.io/projected/3f34750e-27ea-4d28-b6a9-0f65d2b29e92-kube-api-access-48qnl\") pod \"downloads-7954f5f757-jwdk8\" (UID: \"3f34750e-27ea-4d28-b6a9-0f65d2b29e92\") " pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-srv-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451489 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b699df8e-e341-4be3-9c4c-04e3b13d2737-serving-cert\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-webhook-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.451917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptzl\" (UniqueName: \"kubernetes.io/projected/61f0238b-c399-46bc-9f9b-c74d5310db41-kube-api-access-4ptzl\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78x26\" (UniqueName: \"kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451962 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85efff27-fc96-4191-9733-a6a2434a723c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-config-volume\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.451995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84pr\" (UniqueName: 
\"kubernetes.io/projected/754eba16-66a2-4f45-a7b0-77859b76c469-kube-api-access-r84pr\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b699df8e-e341-4be3-9c4c-04e3b13d2737-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-proxy-tls\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.452096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5a433bc-0ca9-4a65-8df6-b065c131f20c-metrics-tls\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5a433bc-0ca9-4a65-8df6-b065c131f20c-trusted-ca\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ee8334-f4bb-440d-831d-52339abe2875-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a2834ef-28bc-4393-9acf-f78292d07b41-service-ca-bundle\") pod 
\"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdt8z\" (UniqueName: \"kubernetes.io/projected/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-kube-api-access-hdt8z\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdd9c623-4f99-434b-9142-374f3798a39e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-profile-collector-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-metrics-tls\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452244 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a062daf8-94d8-4c83-a020-ab3a8c86b752-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnk94\" (UniqueName: \"kubernetes.io/projected/44a7670a-caef-4c90-ac73-c3d959b5b312-kube-api-access-vnk94\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452283 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ddf1184-18c2-4665-8198-0422d2bfd272-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-srv-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f986db0d-78ce-4c23-8b36-acb7eddecfd7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flz6x\" (UniqueName: \"kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x\") pod \"auto-csr-approver-29533400-6w2xx\" (UID: \"2a8700de-dc40-4245-8c99-e792c342b5bb\") " pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rnj\" (UniqueName: \"kubernetes.io/projected/055a78a2-9ef0-47da-a1b4-3528b2370dd1-kube-api-access-d6rnj\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 
25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-mountpoint-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-certs\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b699df8e-e341-4be3-9c4c-04e3b13d2737-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle\") pod \"console-f9d7485db-wmg6x\" (UID: 
\"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a12af2-1536-48da-9867-3eb65b8c82cb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-stats-auth\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/293d7797-862d-4cbf-ac7f-16ef038ac6aa-config\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdd9c623-4f99-434b-9142-374f3798a39e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/293d7797-862d-4cbf-ac7f-16ef038ac6aa-serving-cert\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ddf1184-18c2-4665-8198-0422d2bfd272-config\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-service-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfmz\" (UniqueName: \"kubernetes.io/projected/743f91d6-aff7-4cc8-92d4-41245a3ffa9a-kube-api-access-lnfmz\") pod \"migrator-59844c95c7-9xjkb\" (UID: \"743f91d6-aff7-4cc8-92d4-41245a3ffa9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-config\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-plugins-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqfz4\" (UniqueName: \"kubernetes.io/projected/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-kube-api-access-tqfz4\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/754eba16-66a2-4f45-a7b0-77859b76c469-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f986db0d-78ce-4c23-8b36-acb7eddecfd7-proxy-tls\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.452809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7lpx\" (UniqueName: \"kubernetes.io/projected/a062daf8-94d8-4c83-a020-ab3a8c86b752-kube-api-access-j7lpx\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdkql\" (UniqueName: \"kubernetes.io/projected/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-kube-api-access-bdkql\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br7hq\" (UniqueName: \"kubernetes.io/projected/3e2f1545-81aa-4204-bbcf-333ccd9c000e-kube-api-access-br7hq\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a12af2-1536-48da-9867-3eb65b8c82cb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-csi-data-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntc9\" (UniqueName: \"kubernetes.io/projected/26ee8334-f4bb-440d-831d-52339abe2875-kube-api-access-2ntc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f74503e4-ab5f-4eea-a47a-cc961035d3b7-cert\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-client\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452956 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a062daf8-94d8-4c83-a020-ab3a8c86b752-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.452986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-socket-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwh5\" (UniqueName: \"kubernetes.io/projected/f74503e4-ab5f-4eea-a47a-cc961035d3b7-kube-api-access-pmwh5\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " 
pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-key\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmtm\" (UniqueName: \"kubernetes.io/projected/6a2834ef-28bc-4393-9acf-f78292d07b41-kube-api-access-vhmtm\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmp2\" (UniqueName: \"kubernetes.io/projected/f986db0d-78ce-4c23-8b36-acb7eddecfd7-kube-api-access-vvmp2\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-registration-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-default-certificate\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/055a78a2-9ef0-47da-a1b4-3528b2370dd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddf1184-18c2-4665-8198-0422d2bfd272-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-metrics-certs\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5vmh\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh\") pod 
\"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.453301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-serving-cert\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.454080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ee8334-f4bb-440d-831d-52339abe2875-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.455037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.455357 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:53.95534203 +0000 UTC m=+207.317168050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.456855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.456860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b699df8e-e341-4be3-9c4c-04e3b13d2737-serving-cert\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.457154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a062daf8-94d8-4c83-a020-ab3a8c86b752-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.479355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca\") pod \"console-f9d7485db-wmg6x\" (UID: 
\"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.479408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a12af2-1536-48da-9867-3eb65b8c82cb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a062daf8-94d8-4c83-a020-ab3a8c86b752-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98a12af2-1536-48da-9867-3eb65b8c82cb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.480813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26ee8334-f4bb-440d-831d-52339abe2875-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.484047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config\") pod \"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.487538 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.488172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert\") pod \"console-f9d7485db-wmg6x\" 
(UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.507948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8fr2\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-kube-api-access-b8fr2\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: W0225 07:20:53.514151 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee70ad86_2c43_4d87_9465_9c020b5b4cec.slice/crio-34c4ddb7f564c96592f48cf93c71566586ed6181ca5ed4b943f81465f661ea25 WatchSource:0}: Error finding container 34c4ddb7f564c96592f48cf93c71566586ed6181ca5ed4b943f81465f661ea25: Status 404 returned error can't find the container with id 34c4ddb7f564c96592f48cf93c71566586ed6181ca5ed4b943f81465f661ea25 Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.537184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24p9k\" (UniqueName: \"kubernetes.io/projected/b699df8e-e341-4be3-9c4c-04e3b13d2737-kube-api-access-24p9k\") pod \"openshift-config-operator-7777fb866f-hbl4p\" (UID: \"b699df8e-e341-4be3-9c4c-04e3b13d2737\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.542164 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.547509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcgh6\" (UniqueName: \"kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6\") pod 
\"console-f9d7485db-wmg6x\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-config\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-plugins-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqfz4\" (UniqueName: 
\"kubernetes.io/projected/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-kube-api-access-tqfz4\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/754eba16-66a2-4f45-a7b0-77859b76c469-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f986db0d-78ce-4c23-8b36-acb7eddecfd7-proxy-tls\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdkql\" (UniqueName: \"kubernetes.io/projected/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-kube-api-access-bdkql\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br7hq\" (UniqueName: \"kubernetes.io/projected/3e2f1545-81aa-4204-bbcf-333ccd9c000e-kube-api-access-br7hq\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555413 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-csi-data-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f74503e4-ab5f-4eea-a47a-cc961035d3b7-cert\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555449 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-client\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-socket-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555495 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pmwh5\" (UniqueName: \"kubernetes.io/projected/f74503e4-ab5f-4eea-a47a-cc961035d3b7-kube-api-access-pmwh5\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmtm\" (UniqueName: \"kubernetes.io/projected/6a2834ef-28bc-4393-9acf-f78292d07b41-kube-api-access-vhmtm\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-key\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-registration-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmp2\" (UniqueName: \"kubernetes.io/projected/f986db0d-78ce-4c23-8b36-acb7eddecfd7-kube-api-access-vvmp2\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.555581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-default-certificate\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/055a78a2-9ef0-47da-a1b4-3528b2370dd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddf1184-18c2-4665-8198-0422d2bfd272-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-metrics-certs\") pod 
\"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-serving-cert\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/d7a030e0-e24c-4c03-a1f0-0840c7eec829-kube-api-access-qndcs\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxz7\" (UniqueName: 
\"kubernetes.io/projected/9c785750-976e-4ed9-a9dc-e8df35faeb94-kube-api-access-fdxz7\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-node-bootstrap-token\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-tmpfs\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfw4d\" (UniqueName: \"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-kube-api-access-nfw4d\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjxjc\" (UniqueName: \"kubernetes.io/projected/94a0babc-26c5-4544-8279-449668534ba2-kube-api-access-hjxjc\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555812 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nfl\" (UniqueName: \"kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cq2h\" (UniqueName: \"kubernetes.io/projected/293d7797-862d-4cbf-ac7f-16ef038ac6aa-kube-api-access-7cq2h\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdd9c623-4f99-434b-9142-374f3798a39e-config\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555865 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-images\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-cabundle\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxr26\" (UniqueName: \"kubernetes.io/projected/85efff27-fc96-4191-9733-a6a2434a723c-kube-api-access-hxr26\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-srv-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.555978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-webhook-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78x26\" (UniqueName: \"kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptzl\" (UniqueName: \"kubernetes.io/projected/61f0238b-c399-46bc-9f9b-c74d5310db41-kube-api-access-4ptzl\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85efff27-fc96-4191-9733-a6a2434a723c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-config-volume\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: 
I0225 07:20:53.556062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r84pr\" (UniqueName: \"kubernetes.io/projected/754eba16-66a2-4f45-a7b0-77859b76c469-kube-api-access-r84pr\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-proxy-tls\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5a433bc-0ca9-4a65-8df6-b065c131f20c-metrics-tls\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5a433bc-0ca9-4a65-8df6-b065c131f20c-trusted-ca\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a2834ef-28bc-4393-9acf-f78292d07b41-service-ca-bundle\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdt8z\" (UniqueName: \"kubernetes.io/projected/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-kube-api-access-hdt8z\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdd9c623-4f99-434b-9142-374f3798a39e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-profile-collector-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-metrics-tls\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnk94\" (UniqueName: \"kubernetes.io/projected/44a7670a-caef-4c90-ac73-c3d959b5b312-kube-api-access-vnk94\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ddf1184-18c2-4665-8198-0422d2bfd272-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flz6x\" (UniqueName: \"kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x\") pod \"auto-csr-approver-29533400-6w2xx\" (UID: \"2a8700de-dc40-4245-8c99-e792c342b5bb\") " pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556276 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-srv-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556291 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f986db0d-78ce-4c23-8b36-acb7eddecfd7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rnj\" (UniqueName: \"kubernetes.io/projected/055a78a2-9ef0-47da-a1b4-3528b2370dd1-kube-api-access-d6rnj\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-mountpoint-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-certs\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-stats-auth\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 
07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/293d7797-862d-4cbf-ac7f-16ef038ac6aa-config\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdd9c623-4f99-434b-9142-374f3798a39e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/293d7797-862d-4cbf-ac7f-16ef038ac6aa-serving-cert\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfmz\" (UniqueName: \"kubernetes.io/projected/743f91d6-aff7-4cc8-92d4-41245a3ffa9a-kube-api-access-lnfmz\") pod \"migrator-59844c95c7-9xjkb\" 
(UID: \"743f91d6-aff7-4cc8-92d4-41245a3ffa9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ddf1184-18c2-4665-8198-0422d2bfd272-config\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.556472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-service-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.557516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-service-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.557976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdd9c623-4f99-434b-9142-374f3798a39e-config\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.558023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-images\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.558068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-registration-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.558179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-socket-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.559427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-config\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.559496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-plugins-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.561889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6a2834ef-28bc-4393-9acf-f78292d07b41-service-ca-bundle\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.564430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f986db0d-78ce-4c23-8b36-acb7eddecfd7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.565553 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-default-certificate\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.568532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.569766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-srv-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.570110 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f986db0d-78ce-4c23-8b36-acb7eddecfd7-proxy-tls\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.573328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-srv-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.574018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qnl\" (UniqueName: \"kubernetes.io/projected/3f34750e-27ea-4d28-b6a9-0f65d2b29e92-kube-api-access-48qnl\") pod \"downloads-7954f5f757-jwdk8\" (UID: \"3f34750e-27ea-4d28-b6a9-0f65d2b29e92\") " pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.574193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-mountpoint-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.575074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-tmpfs\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.575732 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-node-bootstrap-token\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.576203 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.076177698 +0000 UTC m=+207.438003718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.577799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85efff27-fc96-4191-9733-a6a2434a723c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.578260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3e2f1545-81aa-4204-bbcf-333ccd9c000e-csi-data-dir\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") 
" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.578687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ddf1184-18c2-4665-8198-0422d2bfd272-config\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.579051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-cabundle\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.579070 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.579541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5a433bc-0ca9-4a65-8df6-b065c131f20c-metrics-tls\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9c785750-976e-4ed9-a9dc-e8df35faeb94-signing-key\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/293d7797-862d-4cbf-ac7f-16ef038ac6aa-config\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-metrics-tls\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ddf1184-18c2-4665-8198-0422d2bfd272-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.581814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/94a0babc-26c5-4544-8279-449668534ba2-profile-collector-cert\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.582235 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5a433bc-0ca9-4a65-8df6-b065c131f20c-trusted-ca\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.583290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-config-volume\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.583534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-apiservice-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.584842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-ca\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.584952 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.585313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.585381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.587808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a12af2-1536-48da-9867-3eb65b8c82cb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xfqlg\" (UID: \"98a12af2-1536-48da-9867-3eb65b8c82cb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.589741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-webhook-cert\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.591952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-etcd-client\") pod 
\"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.592907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44a7670a-caef-4c90-ac73-c3d959b5b312-serving-cert\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.593338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.594812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/293d7797-862d-4cbf-ac7f-16ef038ac6aa-serving-cert\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.599083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-proxy-tls\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.599087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5vmh\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc 
kubenswrapper[4749]: I0225 07:20:53.599835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.599909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-stats-auth\") pod \"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.601141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/754eba16-66a2-4f45-a7b0-77859b76c469-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.601776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.603847 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rfrvr"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdd9c623-4f99-434b-9142-374f3798a39e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7a030e0-e24c-4c03-a1f0-0840c7eec829-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f74503e4-ab5f-4eea-a47a-cc961035d3b7-cert\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/055a78a2-9ef0-47da-a1b4-3528b2370dd1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605500 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/61f0238b-c399-46bc-9f9b-c74d5310db41-certs\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.605610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a2834ef-28bc-4393-9acf-f78292d07b41-metrics-certs\") pod \"router-default-5444994796-zlfns\" 
(UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.612889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" event={"ID":"76c21345-d376-4531-a431-faa389fc0623","Type":"ContainerStarted","Data":"e7aeaf77f7da55fffe1535ff45d621a6c27dc81a02cf4d8e71959b08f7cb990f"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.620696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" event={"ID":"5eee50b2-fb76-4a78-a3f9-b364421a4178","Type":"ContainerStarted","Data":"0cd234751d970185ed254a12065931b8b0f13788b76c1c43c2ae53897b1c2055"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.620735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" event={"ID":"5eee50b2-fb76-4a78-a3f9-b364421a4178","Type":"ContainerStarted","Data":"af86a6852733fb5735ef88d9faacac0580aa1b2f5cc7ae6d127f60601df5180c"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.621538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" event={"ID":"837b8ba9-ad4e-4186-be8a-60807351bf87","Type":"ContainerStarted","Data":"2a9ea25782240a49504820a6eb0cc6f479528b718fb5c82f3c8fde8553483644"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.622696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" event={"ID":"34230523-16b1-4cba-8dfe-bfd3f0127944","Type":"ContainerStarted","Data":"c0cf6397a2cbb8ff7a8aeeb78350876ef5e732260634c5f49b2fb78671d842d7"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.622737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" event={"ID":"34230523-16b1-4cba-8dfe-bfd3f0127944","Type":"ContainerStarted","Data":"2176e1fa8a6e80346a8b3a69ff336cd021c98a2549cb96150dc96918249cf730"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.624466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" event={"ID":"ee70ad86-2c43-4d87-9465-9c020b5b4cec","Type":"ContainerStarted","Data":"34c4ddb7f564c96592f48cf93c71566586ed6181ca5ed4b943f81465f661ea25"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.625711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" event={"ID":"a0e416b4-b82c-46d1-a612-9736b8e6db14","Type":"ContainerStarted","Data":"3a24f3c2b3abcaf5ae705793b8ff2c6cd522a28b41f8eb9916b21b8928a7106f"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.625737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" event={"ID":"a0e416b4-b82c-46d1-a612-9736b8e6db14","Type":"ContainerStarted","Data":"5f480f967cbaaf24ed5cb9f8f9bba9b896ac11aa55e6e64439cfef7dbf21af56"} Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.657512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.658007 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:54.157994167 +0000 UTC m=+207.519820187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.703413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2scs"] Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.759286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.759558 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.259515616 +0000 UTC m=+207.621341666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.759983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.760326 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.260309437 +0000 UTC m=+207.622135467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.800941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7lpx\" (UniqueName: \"kubernetes.io/projected/a062daf8-94d8-4c83-a020-ab3a8c86b752-kube-api-access-j7lpx\") pod \"openshift-controller-manager-operator-756b6f6bc6-b694p\" (UID: \"a062daf8-94d8-4c83-a020-ab3a8c86b752\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.808194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.808375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwh5\" (UniqueName: \"kubernetes.io/projected/f74503e4-ab5f-4eea-a47a-cc961035d3b7-kube-api-access-pmwh5\") pod \"ingress-canary-qwfdr\" (UID: \"f74503e4-ab5f-4eea-a47a-cc961035d3b7\") " pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.809513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmtm\" (UniqueName: \"kubernetes.io/projected/6a2834ef-28bc-4393-9acf-f78292d07b41-kube-api-access-vhmtm\") pod 
\"router-default-5444994796-zlfns\" (UID: \"6a2834ef-28bc-4393-9acf-f78292d07b41\") " pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.809635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntc9\" (UniqueName: \"kubernetes.io/projected/26ee8334-f4bb-440d-831d-52339abe2875-kube-api-access-2ntc9\") pod \"kube-storage-version-migrator-operator-b67b599dd-lldcc\" (UID: \"26ee8334-f4bb-440d-831d-52339abe2875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.813004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmp2\" (UniqueName: \"kubernetes.io/projected/f986db0d-78ce-4c23-8b36-acb7eddecfd7-kube-api-access-vvmp2\") pod \"machine-config-controller-84d6567774-f25f5\" (UID: \"f986db0d-78ce-4c23-8b36-acb7eddecfd7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.813425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ddf1184-18c2-4665-8198-0422d2bfd272-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4knq5\" (UID: \"6ddf1184-18c2-4665-8198-0422d2bfd272\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.813815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqfz4\" (UniqueName: \"kubernetes.io/projected/ffa156ca-788a-4c89-ac8e-e24a8344e7ba-kube-api-access-tqfz4\") pod \"dns-default-8jn8v\" (UID: \"ffa156ca-788a-4c89-ac8e-e24a8344e7ba\") " pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.814041 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdt8z\" (UniqueName: \"kubernetes.io/projected/9c7a3756-1691-4e02-b3ac-4b87d324f6a7-kube-api-access-hdt8z\") pod \"packageserver-d55dfcdfc-5v7c4\" (UID: \"9c7a3756-1691-4e02-b3ac-4b87d324f6a7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.814775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxr26\" (UniqueName: \"kubernetes.io/projected/85efff27-fc96-4191-9733-a6a2434a723c-kube-api-access-hxr26\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rbvc\" (UID: \"85efff27-fc96-4191-9733-a6a2434a723c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.820537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.841579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdd9c623-4f99-434b-9142-374f3798a39e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-k8mtd\" (UID: \"fdd9c623-4f99-434b-9142-374f3798a39e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.844741 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qwfdr" Feb 25 07:20:53 crc kubenswrapper[4749]: W0225 07:20:53.852470 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod197fe25d_5e80_4fd6_84a2_3596e19e3703.slice/crio-7ef182d58948878f25ddaab0458915531611cc3f63ce366b8f3d4c72232ed1fb WatchSource:0}: Error finding container 7ef182d58948878f25ddaab0458915531611cc3f63ce366b8f3d4c72232ed1fb: Status 404 returned error can't find the container with id 7ef182d58948878f25ddaab0458915531611cc3f63ce366b8f3d4c72232ed1fb Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.866410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flz6x\" (UniqueName: \"kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x\") pod \"auto-csr-approver-29533400-6w2xx\" (UID: \"2a8700de-dc40-4245-8c99-e792c342b5bb\") " pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.866584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.868668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.368645483 +0000 UTC m=+207.730471513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.869088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.869585 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.369572428 +0000 UTC m=+207.731398448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.874915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdkql\" (UniqueName: \"kubernetes.io/projected/b78a8a4c-c2c3-4add-8295-2a9edbeb37df-kube-api-access-bdkql\") pod \"machine-config-operator-74547568cd-vkrp5\" (UID: \"b78a8a4c-c2c3-4add-8295-2a9edbeb37df\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.890346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.893261 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.897111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rnj\" (UniqueName: \"kubernetes.io/projected/055a78a2-9ef0-47da-a1b4-3528b2370dd1-kube-api-access-d6rnj\") pod \"package-server-manager-789f6589d5-5x8pj\" (UID: \"055a78a2-9ef0-47da-a1b4-3528b2370dd1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.916727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndcs\" (UniqueName: \"kubernetes.io/projected/d7a030e0-e24c-4c03-a1f0-0840c7eec829-kube-api-access-qndcs\") pod \"olm-operator-6b444d44fb-2cgdt\" (UID: \"d7a030e0-e24c-4c03-a1f0-0840c7eec829\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.942058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptzl\" (UniqueName: \"kubernetes.io/projected/61f0238b-c399-46bc-9f9b-c74d5310db41-kube-api-access-4ptzl\") pod \"machine-config-server-rh464\" (UID: \"61f0238b-c399-46bc-9f9b-c74d5310db41\") " pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.961524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78x26\" (UniqueName: \"kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26\") pod \"collect-profiles-29533395-7vrbg\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.962107 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.970259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.970501 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.47048396 +0000 UTC m=+207.832309980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.970618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.973102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfw4d\" (UniqueName: 
\"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-kube-api-access-nfw4d\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:53 crc kubenswrapper[4749]: E0225 07:20:53.973141 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.473127682 +0000 UTC m=+207.834953702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.973393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" Feb 25 07:20:53 crc kubenswrapper[4749]: I0225 07:20:53.986348 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.000084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjxjc\" (UniqueName: \"kubernetes.io/projected/94a0babc-26c5-4544-8279-449668534ba2-kube-api-access-hjxjc\") pod \"catalog-operator-68c6474976-thwbg\" (UID: \"94a0babc-26c5-4544-8279-449668534ba2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.003263 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.015200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.037274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cq2h\" (UniqueName: \"kubernetes.io/projected/293d7797-862d-4cbf-ac7f-16ef038ac6aa-kube-api-access-7cq2h\") pod \"service-ca-operator-777779d784-wxblp\" (UID: \"293d7797-862d-4cbf-ac7f-16ef038ac6aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.038198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nfl\" (UniqueName: \"kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl\") pod \"marketplace-operator-79b997595-djp99\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.038631 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.046985 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.047483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mqgw7"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.052209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br7hq\" (UniqueName: \"kubernetes.io/projected/3e2f1545-81aa-4204-bbcf-333ccd9c000e-kube-api-access-br7hq\") pod \"csi-hostpathplugin-9jt7w\" (UID: \"3e2f1545-81aa-4204-bbcf-333ccd9c000e\") " pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.053996 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.061140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.067579 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.068894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfmz\" (UniqueName: \"kubernetes.io/projected/743f91d6-aff7-4cc8-92d4-41245a3ffa9a-kube-api-access-lnfmz\") pod \"migrator-59844c95c7-9xjkb\" (UID: \"743f91d6-aff7-4cc8-92d4-41245a3ffa9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.072978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.073182 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.57315251 +0000 UTC m=+207.934978530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.073240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.073580 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.573568751 +0000 UTC m=+207.935394761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.090859 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.101313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxz7\" (UniqueName: \"kubernetes.io/projected/9c785750-976e-4ed9-a9dc-e8df35faeb94-kube-api-access-fdxz7\") pod \"service-ca-9c57cc56f-jscrf\" (UID: \"9c785750-976e-4ed9-a9dc-e8df35faeb94\") " pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.109886 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.114052 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a433bc-0ca9-4a65-8df6-b065c131f20c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sqwk5\" (UID: \"e5a433bc-0ca9-4a65-8df6-b065c131f20c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.114114 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.125181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.132747 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.141414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84pr\" (UniqueName: \"kubernetes.io/projected/754eba16-66a2-4f45-a7b0-77859b76c469-kube-api-access-r84pr\") pod \"multus-admission-controller-857f4d67dd-8pv62\" (UID: \"754eba16-66a2-4f45-a7b0-77859b76c469\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.149454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.154475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnk94\" (UniqueName: \"kubernetes.io/projected/44a7670a-caef-4c90-ac73-c3d959b5b312-kube-api-access-vnk94\") pod \"etcd-operator-b45778765-lhs4d\" (UID: \"44a7670a-caef-4c90-ac73-c3d959b5b312\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.160317 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.164259 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.169800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.175717 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rh464" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.176466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.177088 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.677074085 +0000 UTC m=+208.038900105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.261510 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qwfdr"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.281073 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.282176 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.282573 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.782558172 +0000 UTC m=+208.144384192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.298352 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.304551 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jwdk8"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.308380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.312434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.328056 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.335133 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.386651 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.387101 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.887087173 +0000 UTC m=+208.248913193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.398301 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.488876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.489398 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:54.989387513 +0000 UTC m=+208.351213533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.510206 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p"] Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.592266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.593018 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.092895586 +0000 UTC m=+208.454721606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.593106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.593513 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.093505313 +0000 UTC m=+208.455331333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.655933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" event={"ID":"98a12af2-1536-48da-9867-3eb65b8c82cb","Type":"ContainerStarted","Data":"f0cd6679caba3d83dc423229963d7d19f9fd19b05254f100c6aeac5f936b1678"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.658103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" event={"ID":"76c21345-d376-4531-a431-faa389fc0623","Type":"ContainerStarted","Data":"2209e480368b3aeec65634e662859894911cb4e758a2ec2392672aa5d57ffb84"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.658321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.660362 4749 generic.go:334] "Generic (PLEG): container finished" podID="837b8ba9-ad4e-4186-be8a-60807351bf87" containerID="29b117886264a323c5e12e36cc60c848094757077ba8c160844f22c24a0740a9" exitCode=0 Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.660448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" event={"ID":"837b8ba9-ad4e-4186-be8a-60807351bf87","Type":"ContainerDied","Data":"29b117886264a323c5e12e36cc60c848094757077ba8c160844f22c24a0740a9"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.661524 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" event={"ID":"b699df8e-e341-4be3-9c4c-04e3b13d2737","Type":"ContainerStarted","Data":"1d12ee6b49d97644ca0fcf95d51637aaa3ce263510eb4ea2b0f62b2ed7ca621f"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.663719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" event={"ID":"197fe25d-5e80-4fd6-84a2-3596e19e3703","Type":"ContainerStarted","Data":"c0d292d333252ca2d61e0de8d0d735c58877ce577198c54c829c0645e188191b"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.663748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" event={"ID":"197fe25d-5e80-4fd6-84a2-3596e19e3703","Type":"ContainerStarted","Data":"7ef182d58948878f25ddaab0458915531611cc3f63ce366b8f3d4c72232ed1fb"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.666070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" event={"ID":"5eee50b2-fb76-4a78-a3f9-b364421a4178","Type":"ContainerStarted","Data":"5513b85c9ab7f32097168e38fd01a40b63ffb7417573b5f5548d6a7caeb4632f"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.667493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmg6x" event={"ID":"6943031e-49a1-441a-a659-579d68c5879a","Type":"ContainerStarted","Data":"dc7c54744a1379f9f41777ea133e56aa76904a82b7055d4ba09fa3200642dfd4"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.668617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jwdk8" event={"ID":"3f34750e-27ea-4d28-b6a9-0f65d2b29e92","Type":"ContainerStarted","Data":"828bb6fec234fdc2d6f05c11da59df96512f00e5b423a8b76a9b344a085682b5"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.669821 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rh464" event={"ID":"61f0238b-c399-46bc-9f9b-c74d5310db41","Type":"ContainerStarted","Data":"bb4541d2fca259ce167e35c1eda5df18a339678836fdad533873f3a39c56f7d6"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.671124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zlfns" event={"ID":"6a2834ef-28bc-4393-9acf-f78292d07b41","Type":"ContainerStarted","Data":"16ec7106abbb7ea38de4a60497d56893903a3e025157177f0518ab6fc0ba7563"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.671966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" event={"ID":"e86f841c-6eab-4827-aa78-14db5b10dcc5","Type":"ContainerStarted","Data":"4212f9736f562e9dc897fff010456e11196102b632a33980869f8cf5f82a9e6a"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.673742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" event={"ID":"ee70ad86-2c43-4d87-9465-9c020b5b4cec","Type":"ContainerStarted","Data":"59721d31941aa82e42b587faee5625042d14bd43108445a9ae39bc50d9a454c2"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.674944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.678579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" event={"ID":"d5228e90-419a-4be6-a930-5391b35bee6b","Type":"ContainerStarted","Data":"256ce10f107302d46b87ca744b8d7abc067c0b07aef553dd6cacc0944725afc2"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.683663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-w2scs" event={"ID":"e7f31545-040d-4d7a-8466-e0eaca760b03","Type":"ContainerStarted","Data":"bbd865a5513bd20c470bb1c98b93826fea546596bfcba95bf02e18afe5a1bc26"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.686490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qwfdr" event={"ID":"f74503e4-ab5f-4eea-a47a-cc961035d3b7","Type":"ContainerStarted","Data":"e33509b79422695250fc4175ccfa4c929a94c8515995771e0cb1bd542c5baf80"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.689917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" event={"ID":"723f3a22-e0cb-4b03-952d-7f4e6aece976","Type":"ContainerStarted","Data":"505d5f1130d6ca15f8680277868dd84bfdb66685ce860f3e7fb5688947ac3160"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.690965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" event={"ID":"1209de54-9bcd-424b-836c-77a8b90e494f","Type":"ContainerStarted","Data":"a691e67a912d583f4d72a58042b009b818e7980d0b272e22fd7a35ee93638f81"} Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.696039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.696190 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.196168203 +0000 UTC m=+208.557994223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.696555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.696862 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.196845581 +0000 UTC m=+208.558671601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.797142 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.797425 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.297399514 +0000 UTC m=+208.659225534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.797731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.798856 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.298838783 +0000 UTC m=+208.660664803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.936701 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-x5ctv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.936757 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.937120 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cdjg8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Feb 25 07:20:54 crc kubenswrapper[4749]: I0225 07:20:54.937137 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" podUID="76c21345-d376-4531-a431-faa389fc0623" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Feb 25 07:20:54 crc kubenswrapper[4749]: 
I0225 07:20:54.939952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:54 crc kubenswrapper[4749]: E0225 07:20:54.940535 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.440512991 +0000 UTC m=+208.802339011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.045743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.046216 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:55.546205544 +0000 UTC m=+208.908031564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.146728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.147104 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.647090505 +0000 UTC m=+209.008916525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.200151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.223475 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jn8v"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.239028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.248808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.249098 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.749086807 +0000 UTC m=+209.110912817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.328686 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pzj5d" podStartSLOduration=155.328667705 podStartE2EDuration="2m35.328667705s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:55.32775764 +0000 UTC m=+208.689583660" watchObservedRunningTime="2026-02-25 07:20:55.328667705 +0000 UTC m=+208.690493725" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.352039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.352418 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.852404334 +0000 UTC m=+209.214230354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.453521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.453935 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:55.953916783 +0000 UTC m=+209.315742803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.489321 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qqx6g" podStartSLOduration=155.489301571 podStartE2EDuration="2m35.489301571s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:55.488229182 +0000 UTC m=+208.850055202" watchObservedRunningTime="2026-02-25 07:20:55.489301571 +0000 UTC m=+208.851127581" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.555043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.555212 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.055195525 +0000 UTC m=+209.417021545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.561142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.561254 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.061215489 +0000 UTC m=+209.423041509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.662746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.663073 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.163058717 +0000 UTC m=+209.524884737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.667380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.692697 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.716941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" event={"ID":"d5228e90-419a-4be6-a930-5391b35bee6b","Type":"ContainerStarted","Data":"84a4c537bf6e2fab487a2510a1a7b97c31afd6bc540ba228296b1e950f431be9"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.748200 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.750003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.754197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" event={"ID":"30e28bec-ff89-41e7-93c9-da957325acb0","Type":"ContainerStarted","Data":"bf4d207a366883c0d5227d077fb89ad515d9ae7b3690100b0690b2a0633e0eec"} Feb 25 07:20:55 crc kubenswrapper[4749]: 
I0225 07:20:55.754321 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" event={"ID":"30e28bec-ff89-41e7-93c9-da957325acb0","Type":"ContainerStarted","Data":"c12a73922d9300859e405984550fa64b484ec2c6f0666750d5bb7e1780f9ea1a"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.754387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" event={"ID":"30e28bec-ff89-41e7-93c9-da957325acb0","Type":"ContainerStarted","Data":"9c5e1e6e2a885e07f695febe271a362584b47e48e7ed9450882fb48874f545b2"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.755390 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.764625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.768572 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.268554796 +0000 UTC m=+209.630380816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.788039 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.792753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" event={"ID":"98a12af2-1536-48da-9867-3eb65b8c82cb","Type":"ContainerStarted","Data":"dc761986bfc4718781abb99eb603cec02e85f4ccffd039124ed79be29934027e"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.832941 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.832992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.849491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w2scs" event={"ID":"e7f31545-040d-4d7a-8466-e0eaca760b03","Type":"ContainerStarted","Data":"2d6690dd2d20d234fc3b67078aa861dbcebcdf56035ef5d2b18bd5c142699936"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.850631 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 
07:20:55.867279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.868330 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-w2scs container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.868356 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w2scs" podUID="e7f31545-040d-4d7a-8466-e0eaca760b03" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.869162 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.369147408 +0000 UTC m=+209.730973428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.902377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jwdk8" event={"ID":"3f34750e-27ea-4d28-b6a9-0f65d2b29e92","Type":"ContainerStarted","Data":"0bea22529879541b2de90123709ff2bb8dfbbafd8d6a4a7917f60597a13d692d"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.903475 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.904325 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-jwdk8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.904360 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jwdk8" podUID="3f34750e-27ea-4d28-b6a9-0f65d2b29e92" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.915763 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-jrlsf" podStartSLOduration=155.915748694 podStartE2EDuration="2m35.915748694s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:55.914092688 +0000 UTC m=+209.275918708" watchObservedRunningTime="2026-02-25 07:20:55.915748694 +0000 UTC m=+209.277574714" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.916339 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" podStartSLOduration=155.91633359 podStartE2EDuration="2m35.91633359s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:55.889924507 +0000 UTC m=+209.251750527" watchObservedRunningTime="2026-02-25 07:20:55.91633359 +0000 UTC m=+209.278159610" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.948456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.959985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jscrf"] Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.971685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:55 crc kubenswrapper[4749]: E0225 07:20:55.972166 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:56.472152508 +0000 UTC m=+209.833978528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.976844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rh464" event={"ID":"61f0238b-c399-46bc-9f9b-c74d5310db41","Type":"ContainerStarted","Data":"a795c225e049ffe3265bf286773a349c5c211e39ed6fce97ecbd5be918921989"} Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.981910 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" podStartSLOduration=155.981892124 podStartE2EDuration="2m35.981892124s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:55.980221739 +0000 UTC m=+209.342047759" watchObservedRunningTime="2026-02-25 07:20:55.981892124 +0000 UTC m=+209.343718144" Feb 25 07:20:55 crc kubenswrapper[4749]: I0225 07:20:55.999757 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9jt7w"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.000858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" 
event={"ID":"b78a8a4c-c2c3-4add-8295-2a9edbeb37df","Type":"ContainerStarted","Data":"528790bf6e4ac722cdfb3a7f2e90458d32973c1d1bc651e6d6b718cd36e2a208"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.036246 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.036306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wxblp"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.041056 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.044192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" event={"ID":"26ee8334-f4bb-440d-831d-52339abe2875","Type":"ContainerStarted","Data":"e05fc35f133b48368bd4b7ab9e3f84d8a8c3e1ba65bc1fc1c8c26c5f2b142baf"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.046452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.048735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qwfdr" event={"ID":"f74503e4-ab5f-4eea-a47a-cc961035d3b7","Type":"ContainerStarted","Data":"bd76f546bf033695552210722c4fde4e732bdb3e1c5644a92c171aec24e7c382"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.052169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lhs4d"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.053336 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.053400 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jwdk8" podStartSLOduration=156.053376451 podStartE2EDuration="2m36.053376451s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.032955911 +0000 UTC m=+209.394781931" watchObservedRunningTime="2026-02-25 07:20:56.053376451 +0000 UTC m=+209.415202471" Feb 25 07:20:56 crc kubenswrapper[4749]: W0225 07:20:56.062315 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod293d7797_862d_4cbf_ac7f_16ef038ac6aa.slice/crio-68f35d5bd1733fa5f761e1ea4953857ec6a9eb59e2c61649d4b009e6a6525fdd WatchSource:0}: Error finding container 68f35d5bd1733fa5f761e1ea4953857ec6a9eb59e2c61649d4b009e6a6525fdd: Status 404 returned error can't find the container with id 68f35d5bd1733fa5f761e1ea4953857ec6a9eb59e2c61649d4b009e6a6525fdd Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.075291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xfqlg" podStartSLOduration=156.07526896 podStartE2EDuration="2m36.07526896s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.064903276 +0000 UTC m=+209.426729296" watchObservedRunningTime="2026-02-25 07:20:56.07526896 +0000 UTC m=+209.437094990" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.077378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.078446 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.578385926 +0000 UTC m=+209.940211946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.079547 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533400-6w2xx"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.083040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.086056 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:56.586026464 +0000 UTC m=+209.947852484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.105338 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wqfqj" podStartSLOduration=156.105320052 podStartE2EDuration="2m36.105320052s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.089744096 +0000 UTC m=+209.451570116" watchObservedRunningTime="2026-02-25 07:20:56.105320052 +0000 UTC m=+209.467146072" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.109461 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8pv62"] Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.117805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" event={"ID":"197fe25d-5e80-4fd6-84a2-3596e19e3703","Type":"ContainerStarted","Data":"736526cee186c2e3de242ab1883531e9796214e900d104e65f930317c495a64e"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.130576 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rh464" podStartSLOduration=6.130536613 podStartE2EDuration="6.130536613s" 
podCreationTimestamp="2026-02-25 07:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.129070023 +0000 UTC m=+209.490896043" watchObservedRunningTime="2026-02-25 07:20:56.130536613 +0000 UTC m=+209.492362623" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.156498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmg6x" event={"ID":"6943031e-49a1-441a-a659-579d68c5879a","Type":"ContainerStarted","Data":"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.167698 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-w2scs" podStartSLOduration=156.16768193 podStartE2EDuration="2m36.16768193s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.166326233 +0000 UTC m=+209.528152253" watchObservedRunningTime="2026-02-25 07:20:56.16768193 +0000 UTC m=+209.529507950" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.184959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.185192 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:56.685168398 +0000 UTC m=+210.046994418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.185235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.186274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" event={"ID":"a062daf8-94d8-4c83-a020-ab3a8c86b752","Type":"ContainerStarted","Data":"45ce9ea18518c6aab83ffddb526e888ae3c4d7dd5b56b8709788b931116c65a5"} Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.188774 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.688759087 +0000 UTC m=+210.050585107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.193417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jn8v" event={"ID":"ffa156ca-788a-4c89-ac8e-e24a8344e7ba","Type":"ContainerStarted","Data":"b4b3b2f3859f5fc7b2b9a65994cef8ccda1456b52c4dc09ffaf8b89ac72a2710"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.220941 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wmg6x" podStartSLOduration=156.220923407 podStartE2EDuration="2m36.220923407s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.220137025 +0000 UTC m=+209.581963045" watchObservedRunningTime="2026-02-25 07:20:56.220923407 +0000 UTC m=+209.582749427" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.221671 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tm8jc" podStartSLOduration=156.221666217 podStartE2EDuration="2m36.221666217s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.191584244 +0000 UTC m=+209.553410264" watchObservedRunningTime="2026-02-25 07:20:56.221666217 +0000 UTC m=+209.583492237" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 
07:20:56.224972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" event={"ID":"837b8ba9-ad4e-4186-be8a-60807351bf87","Type":"ContainerStarted","Data":"5defb0980d0958e4a6117c80e2f59ff3e298bc41e04b675f9d250de49f4cf809"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.236120 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.250773 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" podStartSLOduration=156.250735553 podStartE2EDuration="2m36.250735553s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.245649463 +0000 UTC m=+209.607475483" watchObservedRunningTime="2026-02-25 07:20:56.250735553 +0000 UTC m=+209.612561573" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.253855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zlfns" event={"ID":"6a2834ef-28bc-4393-9acf-f78292d07b41","Type":"ContainerStarted","Data":"5b6781f19206e1270552aa8748b4670fc6964d0fecc3196366a8b17b3bf29638"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.255493 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34786: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.263740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" event={"ID":"e86f841c-6eab-4827-aa78-14db5b10dcc5","Type":"ContainerStarted","Data":"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.264722 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.270042 4749 generic.go:334] "Generic (PLEG): container finished" podID="1209de54-9bcd-424b-836c-77a8b90e494f" containerID="91c291e14de3e6af560b1a0c14f551e389aadfa485924682fbdd039639850170" exitCode=0 Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.270108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" event={"ID":"1209de54-9bcd-424b-836c-77a8b90e494f","Type":"ContainerDied","Data":"91c291e14de3e6af560b1a0c14f551e389aadfa485924682fbdd039639850170"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.283641 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qwfdr" podStartSLOduration=6.283626233 podStartE2EDuration="6.283626233s" podCreationTimestamp="2026-02-25 07:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.281122875 +0000 UTC m=+209.642948895" watchObservedRunningTime="2026-02-25 07:20:56.283626233 +0000 UTC m=+209.645452243" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.289870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" event={"ID":"723f3a22-e0cb-4b03-952d-7f4e6aece976","Type":"ContainerStarted","Data":"5e0fa2ebc5868c2b0093ea747ddb34930d2e5be148b9a7d3e81f41c4349b1e83"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.289908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" event={"ID":"723f3a22-e0cb-4b03-952d-7f4e6aece976","Type":"ContainerStarted","Data":"389f7a3a0c00d8eb0f5ab939e6bd57151d8250bd409722ca5e6d73185bf44429"} Feb 25 07:20:56 crc 
kubenswrapper[4749]: I0225 07:20:56.297361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.298229 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.798214412 +0000 UTC m=+210.160040422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.309878 4749 generic.go:334] "Generic (PLEG): container finished" podID="b699df8e-e341-4be3-9c4c-04e3b13d2737" containerID="1c05cf6a574d380a15d976f98569365d795bf4e27f76388b03587ef1012ab2a2" exitCode=0 Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.313256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" event={"ID":"b699df8e-e341-4be3-9c4c-04e3b13d2737","Type":"ContainerDied","Data":"1c05cf6a574d380a15d976f98569365d795bf4e27f76388b03587ef1012ab2a2"} Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.332688 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.333082 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.340181 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" podStartSLOduration=156.340163211 podStartE2EDuration="2m36.340163211s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.306641273 +0000 UTC m=+209.668467293" watchObservedRunningTime="2026-02-25 07:20:56.340163211 +0000 UTC m=+209.701989221" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.341344 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rfrvr" podStartSLOduration=156.341338903 podStartE2EDuration="2m36.341338903s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.340413157 +0000 UTC m=+209.702239187" watchObservedRunningTime="2026-02-25 07:20:56.341338903 +0000 UTC m=+209.703164923" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.350713 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34788: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.399835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.407563 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:56.907266807 +0000 UTC m=+210.269092817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.447669 4749 ???:1] "http: TLS handshake error from 192.168.126.11:34800: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.463077 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zlfns" podStartSLOduration=156.463063195 podStartE2EDuration="2m36.463063195s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.461359787 +0000 UTC m=+209.823185807" watchObservedRunningTime="2026-02-25 07:20:56.463063195 +0000 UTC m=+209.824889215" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.501240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.501397 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.001366973 +0000 UTC m=+210.363193003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.501811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.502174 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.002158744 +0000 UTC m=+210.363984764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.531102 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mqgw7" podStartSLOduration=156.531086906 podStartE2EDuration="2m36.531086906s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.529443901 +0000 UTC m=+209.891269921" watchObservedRunningTime="2026-02-25 07:20:56.531086906 +0000 UTC m=+209.892912926" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.550411 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40378: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.603743 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.604165 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-25 07:20:57.104147876 +0000 UTC m=+210.465973896 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.642980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.652797 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40386: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.707983 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" podStartSLOduration=156.707964647 podStartE2EDuration="2m36.707964647s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:56.580086817 +0000 UTC m=+209.941912837" watchObservedRunningTime="2026-02-25 07:20:56.707964647 +0000 UTC m=+210.069790667" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.708474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.708799 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.20878592 +0000 UTC m=+210.570611940 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.765760 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40388: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.811415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.811655 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.311641296 +0000 UTC m=+210.673467316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.873186 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40402: no serving certificate available for the kubelet" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.917414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:56 crc kubenswrapper[4749]: E0225 07:20:56.917799 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.41778055 +0000 UTC m=+210.779606630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.965252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.976730 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:20:56 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:20:56 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:20:56 crc kubenswrapper[4749]: healthz check failed Feb 25 07:20:56 crc kubenswrapper[4749]: I0225 07:20:56.976820 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.021148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.021278 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.521256382 +0000 UTC m=+210.883082402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.021677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.022100 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.522087276 +0000 UTC m=+210.883913296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.057343 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40404: no serving certificate available for the kubelet" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.128208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.128339 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.628323163 +0000 UTC m=+210.990149183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.128390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.128774 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.628746365 +0000 UTC m=+210.990572385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.230436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.230666 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.730645534 +0000 UTC m=+211.092471544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.331487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.331958 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.831946356 +0000 UTC m=+211.193772376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.388568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerStarted","Data":"d29f40878a16488090d33422670ff524a2e1b2c5009b60a5046e69d023b3cfd2"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.388628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerStarted","Data":"77cfa9dfadfb16c289ffe8374c308b97f0576ecb16b8084362c927faacc5064e"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.389388 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.406303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" event={"ID":"9c785750-976e-4ed9-a9dc-e8df35faeb94","Type":"ContainerStarted","Data":"72d90fd2956c6ff413a773d9b442b3c81b51919315da0d53988d6c187a3fa568"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.406344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" event={"ID":"9c785750-976e-4ed9-a9dc-e8df35faeb94","Type":"ContainerStarted","Data":"9e9f8d2a879c5a4daa11c7e375f2d342f5813253ff9769a4a71449e5f76b6dd0"} Feb 25 07:20:57 crc 
kubenswrapper[4749]: I0225 07:20:57.407216 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-djp99 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.407241 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.417367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" event={"ID":"754eba16-66a2-4f45-a7b0-77859b76c469","Type":"ContainerStarted","Data":"4169f900c8be38751b278fd8967e065dafa921192b397f2d0fdd40a460939121"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.432429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.433639 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:57.933576708 +0000 UTC m=+211.295402728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.453575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" event={"ID":"f986db0d-78ce-4c23-8b36-acb7eddecfd7","Type":"ContainerStarted","Data":"12a1906ac7ca78266373126bbc60f60344cc4ed0ca21ff944831251fecb6b5b2"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.453625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" event={"ID":"f986db0d-78ce-4c23-8b36-acb7eddecfd7","Type":"ContainerStarted","Data":"8ac16d3299ccb910f66fb751591db303c8a579280106dd7f1e5f046ddfe7dba4"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.470741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jn8v" event={"ID":"ffa156ca-788a-4c89-ac8e-e24a8344e7ba","Type":"ContainerStarted","Data":"add0a8f92c309f1a7cf334beffd2dc5b21465fda0f236006f91af983f95f240f"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.486499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" event={"ID":"055a78a2-9ef0-47da-a1b4-3528b2370dd1","Type":"ContainerStarted","Data":"ceb826a7803482fe372e9f72b49a8cc33eff34c698392bc0e017a9a5e2486752"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.486537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" event={"ID":"055a78a2-9ef0-47da-a1b4-3528b2370dd1","Type":"ContainerStarted","Data":"59e9cf4d1ac67e64be8446017e67fa93201c354897ecad217ff2d1bfec8bd6a5"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.523983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" event={"ID":"bec106c3-2cb1-4c08-8b5a-920797ec4142","Type":"ContainerStarted","Data":"3e1f4ce278983d29157591a2b038e7aa63556219a96f8a79dba5e9130a1a0df4"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.524025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" event={"ID":"bec106c3-2cb1-4c08-8b5a-920797ec4142","Type":"ContainerStarted","Data":"5fe74052c264a472c66b3552add0383235230ec340b99faef63a4a277d66d8a8"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.533789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.534823 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.034807659 +0000 UTC m=+211.396633679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.551116 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.591272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" event={"ID":"3e2f1545-81aa-4204-bbcf-333ccd9c000e","Type":"ContainerStarted","Data":"a02e17a4cd778da6d30f4071ac7a30c1bce3b56eb149dc579dade5d159505bfa"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.619213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" event={"ID":"837b8ba9-ad4e-4186-be8a-60807351bf87","Type":"ContainerStarted","Data":"a696128e04ebb093156d206bd94b650a057976c634f44faef648fe6bed100107"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.632334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" event={"ID":"d7a030e0-e24c-4c03-a1f0-0840c7eec829","Type":"ContainerStarted","Data":"6f1fdb3c204db2a744e76d0012a3020207ea5e991288ab786b1d378a62fb1bde"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.632392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" event={"ID":"d7a030e0-e24c-4c03-a1f0-0840c7eec829","Type":"ContainerStarted","Data":"fc67bf5f79c5e3851addbafa8860bf555586cfa7c1ca803b413fd8bfb5369a11"} Feb 
25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.632683 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.634468 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.635333 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.13531732 +0000 UTC m=+211.497143340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.642736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" event={"ID":"85efff27-fc96-4191-9733-a6a2434a723c","Type":"ContainerStarted","Data":"5378ae6490ddcb393aff789f917712f90bff2ecaacf504aa2d4b538b6ef1d440"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.642780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" 
event={"ID":"85efff27-fc96-4191-9733-a6a2434a723c","Type":"ContainerStarted","Data":"8d50f42cff1d0f9624b0c5abd36f2f0af4a0154f9bca3b0aa39a50b4bbc21d5b"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.646209 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.647869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" event={"ID":"2a8700de-dc40-4245-8c99-e792c342b5bb","Type":"ContainerStarted","Data":"ca92fcd2cee8b86501e97ca8080b59de90b813c81f422cc83fdb095f715ed219"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.669891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" event={"ID":"9c7a3756-1691-4e02-b3ac-4b87d324f6a7","Type":"ContainerStarted","Data":"df46292fbe00dc86abd78eeda025c8d157a41e14137dfb5c3f3565cd22ba86be"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.669941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" event={"ID":"9c7a3756-1691-4e02-b3ac-4b87d324f6a7","Type":"ContainerStarted","Data":"290acf4e4cffba7dfadfc3c668a8ce23774250be8355a2790a9d790dee56ee3e"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.670886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.680802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b694p" event={"ID":"a062daf8-94d8-4c83-a020-ab3a8c86b752","Type":"ContainerStarted","Data":"44152f00fd02e0efc3f4de77aa704ba34b01b17bacf854d86aac558883cb0923"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 
07:20:57.684804 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5v7c4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.684839 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" podUID="9c7a3756-1691-4e02-b3ac-4b87d324f6a7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.701811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" event={"ID":"6ddf1184-18c2-4665-8198-0422d2bfd272","Type":"ContainerStarted","Data":"c251fe46766d7df0aa7ae16cee48760e9251ebd5ce76ec79673fc25024290d20"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.709220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" event={"ID":"fdd9c623-4f99-434b-9142-374f3798a39e","Type":"ContainerStarted","Data":"034dc8aa7c6fb71eee347ea43ee53f87ae01a3f2043c396c3b52154c5ff6c44f"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.709310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" event={"ID":"fdd9c623-4f99-434b-9142-374f3798a39e","Type":"ContainerStarted","Data":"d1f0daef67f4cce2cd16b1dd1087dc000a0cd5e36de7007e86ae7c40c350a627"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.736239 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.736493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" event={"ID":"e5a433bc-0ca9-4a65-8df6-b065c131f20c","Type":"ContainerStarted","Data":"5abd3700aa89460437783c787417bdb6a56b453b0431f115fc606eb084a95ff4"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.736635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" event={"ID":"e5a433bc-0ca9-4a65-8df6-b065c131f20c","Type":"ContainerStarted","Data":"54871c4256e0c15c621420220b2f39b63ada5e0434bf87c27887bbff9760f370"} Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.738397 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.23838471 +0000 UTC m=+211.600210730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.751534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" event={"ID":"293d7797-862d-4cbf-ac7f-16ef038ac6aa","Type":"ContainerStarted","Data":"3c01c00ab0d701fcc2fe1025e5445a87ef022653f3170f472df32e2a26d618a2"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.751574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" event={"ID":"293d7797-862d-4cbf-ac7f-16ef038ac6aa","Type":"ContainerStarted","Data":"68f35d5bd1733fa5f761e1ea4953857ec6a9eb59e2c61649d4b009e6a6525fdd"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.771059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lldcc" event={"ID":"26ee8334-f4bb-440d-831d-52339abe2875","Type":"ContainerStarted","Data":"130fe050e3f0b0ae9e509a2843741af6f2b4c6d2110f9b181b74a66e00b71409"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.774331 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40410: no serving certificate available for the kubelet" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.799865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" 
event={"ID":"94a0babc-26c5-4544-8279-449668534ba2","Type":"ContainerStarted","Data":"5a4efbc0e3446a65baed91e16969c1dc0c4d7b2e88db4c620a5ad9af520a494b"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.799905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" event={"ID":"94a0babc-26c5-4544-8279-449668534ba2","Type":"ContainerStarted","Data":"4ae3becbb0bcbc313295014bd84881505843869bec88749d45ba3d82728c7fae"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.801149 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.818689 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-thwbg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.818733 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" podUID="94a0babc-26c5-4544-8279-449668534ba2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.832354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" event={"ID":"b78a8a4c-c2c3-4add-8295-2a9edbeb37df","Type":"ContainerStarted","Data":"3a6c06f7b8347445ca1263b1a36dff1b25fe6732963c81d9a4fb5a54cee03805"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.832395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" event={"ID":"b78a8a4c-c2c3-4add-8295-2a9edbeb37df","Type":"ContainerStarted","Data":"eeddf027e8cf2c6d7b6c1f12ba7790ce3328d03fbca3f29fcb380a5400e76cf1"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.837476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.837922 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.337904324 +0000 UTC m=+211.699730344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.853483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" event={"ID":"743f91d6-aff7-4cc8-92d4-41245a3ffa9a","Type":"ContainerStarted","Data":"db8ea3b4975ba9f1fffe46fd0565c8bc48ae5f0669f06481b022e76059fdad06"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.853523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" event={"ID":"743f91d6-aff7-4cc8-92d4-41245a3ffa9a","Type":"ContainerStarted","Data":"77507621d0c99dea9dfd3c6fb4b708b80044878a32ebe4189da997900a75dedd"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.856822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" event={"ID":"44a7670a-caef-4c90-ac73-c3d959b5b312","Type":"ContainerStarted","Data":"e1131ccc393f8eb69ca65ea47130c27ce99ed41fd521e4fbe7132fc9c18f9b01"} Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.857114 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-jwdk8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.857147 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jwdk8" 
podUID="3f34750e-27ea-4d28-b6a9-0f65d2b29e92" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.870406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-w2scs" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.939356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:57 crc kubenswrapper[4749]: E0225 07:20:57.941565 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.441550102 +0000 UTC m=+211.803376122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.960383 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" podStartSLOduration=157.960352236 podStartE2EDuration="2m37.960352236s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:57.938322743 +0000 UTC m=+211.300148753" watchObservedRunningTime="2026-02-25 07:20:57.960352236 +0000 UTC m=+211.322178256" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.962144 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jscrf" podStartSLOduration=157.962138345 podStartE2EDuration="2m37.962138345s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:57.960000577 +0000 UTC m=+211.321826597" watchObservedRunningTime="2026-02-25 07:20:57.962138345 +0000 UTC m=+211.323964365" Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.990111 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:20:57 crc kubenswrapper[4749]: 
[-]has-synced failed: reason withheld Feb 25 07:20:57 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:20:57 crc kubenswrapper[4749]: healthz check failed Feb 25 07:20:57 crc kubenswrapper[4749]: I0225 07:20:57.990198 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:57.999942 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" podStartSLOduration=157.999926619 podStartE2EDuration="2m37.999926619s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:57.992371603 +0000 UTC m=+211.354197633" watchObservedRunningTime="2026-02-25 07:20:57.999926619 +0000 UTC m=+211.361752639" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.028222 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vkrp5" podStartSLOduration=158.028204403 podStartE2EDuration="2m38.028204403s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.027423782 +0000 UTC m=+211.389249802" watchObservedRunningTime="2026-02-25 07:20:58.028204403 +0000 UTC m=+211.390030423" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.040887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.042086 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.542070463 +0000 UTC m=+211.903896483 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.060231 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" podStartSLOduration=158.060213739 podStartE2EDuration="2m38.060213739s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.058539064 +0000 UTC m=+211.420365084" watchObservedRunningTime="2026-02-25 07:20:58.060213739 +0000 UTC m=+211.422039759" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.091978 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rbvc" podStartSLOduration=158.091966269 podStartE2EDuration="2m38.091966269s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-25 07:20:58.091021433 +0000 UTC m=+211.452847453" watchObservedRunningTime="2026-02-25 07:20:58.091966269 +0000 UTC m=+211.453792289" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.138882 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.140806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.149277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.149584 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.649572765 +0000 UTC m=+212.011398785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.153060 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" podStartSLOduration=158.153040151 podStartE2EDuration="2m38.153040151s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.147747786 +0000 UTC m=+211.509573806" watchObservedRunningTime="2026-02-25 07:20:58.153040151 +0000 UTC m=+211.514866171" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.154919 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" podStartSLOduration=158.154910502 podStartE2EDuration="2m38.154910502s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.114535416 +0000 UTC m=+211.476361426" watchObservedRunningTime="2026-02-25 07:20:58.154910502 +0000 UTC m=+211.516736522" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.251130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.251426 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.751409633 +0000 UTC m=+212.113235653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.324753 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" podStartSLOduration=158.32473701 podStartE2EDuration="2m38.32473701s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.291985234 +0000 UTC m=+211.653811254" watchObservedRunningTime="2026-02-25 07:20:58.32473701 +0000 UTC m=+211.686563020" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.352978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.353264 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.853253241 +0000 UTC m=+212.215079261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.375197 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wxblp" podStartSLOduration=158.37518014 podStartE2EDuration="2m38.37518014s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.37479411 +0000 UTC m=+211.736620130" watchObservedRunningTime="2026-02-25 07:20:58.37518014 +0000 UTC m=+211.737006160" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.456549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.456849 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:58.956834476 +0000 UTC m=+212.318660486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.553676 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" podStartSLOduration=158.553656596 podStartE2EDuration="2m38.553656596s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.482036606 +0000 UTC m=+211.843862626" watchObservedRunningTime="2026-02-25 07:20:58.553656596 +0000 UTC m=+211.915482616" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.556047 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-k8mtd" podStartSLOduration=158.55603668 podStartE2EDuration="2m38.55603668s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.555552588 +0000 UTC m=+211.917378608" watchObservedRunningTime="2026-02-25 07:20:58.55603668 +0000 UTC m=+211.917862700" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 
07:20:58.564786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.565244 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.065228682 +0000 UTC m=+212.427054702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.632342 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" podStartSLOduration=158.632323289 podStartE2EDuration="2m38.632323289s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.598896814 +0000 UTC m=+211.960722834" watchObservedRunningTime="2026-02-25 07:20:58.632323289 +0000 UTC m=+211.994149309" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.666254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.666429 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.166401281 +0000 UTC m=+212.528227301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.666576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.666873 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.166866725 +0000 UTC m=+212.528692745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.666921 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" podStartSLOduration=158.666904846 podStartE2EDuration="2m38.666904846s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.632773161 +0000 UTC m=+211.994599191" watchObservedRunningTime="2026-02-25 07:20:58.666904846 +0000 UTC m=+212.028730866" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.704268 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2cgdt" podStartSLOduration=158.704251737 podStartE2EDuration="2m38.704251737s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.665449436 +0000 UTC m=+212.027275446" watchObservedRunningTime="2026-02-25 07:20:58.704251737 +0000 UTC m=+212.066077757" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.767094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.767397 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.267382095 +0000 UTC m=+212.629208115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.868064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.868336 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.368323918 +0000 UTC m=+212.730149938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.883579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lhs4d" event={"ID":"44a7670a-caef-4c90-ac73-c3d959b5b312","Type":"ContainerStarted","Data":"c4844059c80acd3602b098133ce61ed531a64d4da489a17fd693daa6679718b9"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.897321 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" event={"ID":"b699df8e-e341-4be3-9c4c-04e3b13d2737","Type":"ContainerStarted","Data":"ddea6b71f1eb3d494feaaca4691f858265a423404705d66eda5d8e8503faaa5a"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.907067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" event={"ID":"3e2f1545-81aa-4204-bbcf-333ccd9c000e","Type":"ContainerStarted","Data":"956160e72ba085c74c770ed5793207a8240ae683ace96f4291208523cd126c7d"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.924359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" event={"ID":"743f91d6-aff7-4cc8-92d4-41245a3ffa9a","Type":"ContainerStarted","Data":"468db7d45df4c4c63801ddbd5c73ce65d9d31fe9fa312683a2afc9a83bd0948c"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.927688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" 
event={"ID":"f986db0d-78ce-4c23-8b36-acb7eddecfd7","Type":"ContainerStarted","Data":"43fdadd1c732de3accdffd0ef706ccc591b13ff1768f7557a2a0bf7221200fa7"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.944030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sqwk5" event={"ID":"e5a433bc-0ca9-4a65-8df6-b065c131f20c","Type":"ContainerStarted","Data":"d1c7463aca74a38b8894b0a23a6a0dcbf48d1f91cbce0a030cdeca28118951e4"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.956193 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xjkb" podStartSLOduration=158.956179283 podStartE2EDuration="2m38.956179283s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.706559051 +0000 UTC m=+212.068385071" watchObservedRunningTime="2026-02-25 07:20:58.956179283 +0000 UTC m=+212.318005303" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.958279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" event={"ID":"754eba16-66a2-4f45-a7b0-77859b76c469","Type":"ContainerStarted","Data":"15b3e04da01e129735c6859dd261cce70a236da44edb5fb0cbc62a74fdb4d9ce"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.958307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" event={"ID":"754eba16-66a2-4f45-a7b0-77859b76c469","Type":"ContainerStarted","Data":"aacc82960aa56f9e00d67bb0e1d9484ae1d3f7a3a5ba5ef71c7ed1b26eca1378"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.967077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" 
event={"ID":"1209de54-9bcd-424b-836c-77a8b90e494f","Type":"ContainerStarted","Data":"9fcec9a74170ca3b098282e02042604d84fb1f14a7cc41acd0dd9de0a51c0a38"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.968491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:58 crc kubenswrapper[4749]: E0225 07:20:58.969307 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.469293162 +0000 UTC m=+212.831119172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.971809 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:20:58 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:20:58 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:20:58 crc kubenswrapper[4749]: healthz check failed Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.972052 4749 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.986515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jn8v" event={"ID":"ffa156ca-788a-4c89-ac8e-e24a8344e7ba","Type":"ContainerStarted","Data":"954b7684bc84b2327a0d236dec5f146bc616af106b8379b5de9526bd11813c1b"} Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.986561 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8jn8v" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.996045 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f25f5" podStartSLOduration=158.996026404 podStartE2EDuration="2m38.996026404s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.958271971 +0000 UTC m=+212.320097991" watchObservedRunningTime="2026-02-25 07:20:58.996026404 +0000 UTC m=+212.357852424" Feb 25 07:20:58 crc kubenswrapper[4749]: I0225 07:20:58.999131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4knq5" event={"ID":"6ddf1184-18c2-4665-8198-0422d2bfd272","Type":"ContainerStarted","Data":"87f0ed91743a86f0d458047916819a05a96d8e2b5def312006dfcee3c1688d96"} Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.022819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" 
event={"ID":"055a78a2-9ef0-47da-a1b4-3528b2370dd1","Type":"ContainerStarted","Data":"29a0c2ddd1d46c938c112992697a76effd04cc79421fcc4a767516a01c9ad68f"} Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.022858 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.029703 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" podStartSLOduration=159.029688005 podStartE2EDuration="2m39.029688005s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:59.028923754 +0000 UTC m=+212.390749774" watchObservedRunningTime="2026-02-25 07:20:59.029688005 +0000 UTC m=+212.391514025" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.029948 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8pv62" podStartSLOduration=159.029942682 podStartE2EDuration="2m39.029942682s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:58.990525953 +0000 UTC m=+212.352351973" watchObservedRunningTime="2026-02-25 07:20:59.029942682 +0000 UTC m=+212.391768702" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.031138 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-djp99 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.031188 4749 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.031552 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-thwbg" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.071338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.071663 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.571651434 +0000 UTC m=+212.933477454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.129683 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8jn8v" podStartSLOduration=8.129664791 podStartE2EDuration="8.129664791s" podCreationTimestamp="2026-02-25 07:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:59.086466439 +0000 UTC m=+212.448292459" watchObservedRunningTime="2026-02-25 07:20:59.129664791 +0000 UTC m=+212.491490801" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.163076 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40418: no serving certificate available for the kubelet" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.172900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.173210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.173492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.173587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.173790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.174463 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.674440767 +0000 UTC m=+213.036266787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.182194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.184493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.189104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.191270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.199118 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" podStartSLOduration=159.199102112 podStartE2EDuration="2m39.199102112s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:20:59.18440441 +0000 UTC m=+212.546230430" watchObservedRunningTime="2026-02-25 07:20:59.199102112 +0000 UTC m=+212.560928132" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.248280 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.270028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.274451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.274498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.274784 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.774772783 +0000 UTC m=+213.136598793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.312556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2-metrics-certs\") pod \"network-metrics-daemon-h66ds\" (UID: \"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2\") " pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.377291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.377613 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.877581747 +0000 UTC m=+213.239407767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.462414 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.485173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.485506 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:20:59.985494961 +0000 UTC m=+213.347320981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.582857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h66ds" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.586162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.586496 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.086482325 +0000 UTC m=+213.448308345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.616514 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.616758 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerName="controller-manager" containerID="cri-o://59721d31941aa82e42b587faee5625042d14bd43108445a9ae39bc50d9a454c2" gracePeriod=30 Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.688991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.689294 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.189280669 +0000 UTC m=+213.551106689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.706968 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.746776 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7fnnb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]log ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]etcd ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/generic-apiserver-start-informers ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/max-in-flight-filter ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 25 07:20:59 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 25 07:20:59 crc kubenswrapper[4749]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectcache ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 25 
07:20:59 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-startinformers ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 25 07:20:59 crc kubenswrapper[4749]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 25 07:20:59 crc kubenswrapper[4749]: livez check failed Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.746826 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" podUID="837b8ba9-ad4e-4186-be8a-60807351bf87" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.780439 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.781322 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.790518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.790886 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.290863989 +0000 UTC m=+213.652690009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.791073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.791130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.791167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.791204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcl8\" (UniqueName: 
\"kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.791554 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.291547088 +0000 UTC m=+213.653373098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.794235 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.800790 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5v7c4" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.802505 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.891479 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.892796 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.894513 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.394497925 +0000 UTC m=+213.756323945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcl8\" (UniqueName: \"kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8\") pod 
\"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5cnk\" (UniqueName: \"kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.894707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.895064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.895137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.895275 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.395268697 +0000 UTC m=+213.757094717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.904050 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.904480 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.952218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcl8\" (UniqueName: \"kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8\") pod \"community-operators-nkp47\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.995569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.995876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " 
pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.995934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.995956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5cnk\" (UniqueName: \"kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: E0225 07:20:59.996578 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.496561129 +0000 UTC m=+213.858387149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.997073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:20:59 crc kubenswrapper[4749]: I0225 07:20:59.997329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.019260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5cnk\" (UniqueName: \"kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk\") pod \"certified-operators-8fzjk\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.041234 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:00 crc kubenswrapper[4749]: [-]has-synced failed: reason 
withheld Feb 25 07:21:00 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:00 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.041291 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.047226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d80b950004f72847746c07142de41b353ae84786a65e3ff5f4860c4075f58211"} Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.076390 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.079970 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.081142 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.081391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" event={"ID":"3e2f1545-81aa-4204-bbcf-333ccd9c000e","Type":"ContainerStarted","Data":"67756035532b6e69e93519cc78381f4c1bf850bab078efad386dbb25144d440e"} Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.086182 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerID="59721d31941aa82e42b587faee5625042d14bd43108445a9ae39bc50d9a454c2" exitCode=0 Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.086962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" event={"ID":"ee70ad86-2c43-4d87-9465-9c020b5b4cec","Type":"ContainerDied","Data":"59721d31941aa82e42b587faee5625042d14bd43108445a9ae39bc50d9a454c2"} Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.090226 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" podUID="e86f841c-6eab-4827-aa78-14db5b10dcc5" containerName="route-controller-manager" containerID="cri-o://f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365" gracePeriod=30 Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.097725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.099777 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.599764784 +0000 UTC m=+213.961590804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.099796 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.143057 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.204123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.204612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.204645 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5gg9\" (UniqueName: \"kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.204803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.205556 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.705541558 +0000 UTC m=+214.067367579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.261158 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.273959 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.274120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5gg9\" (UniqueName: \"kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325200 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325227 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjr4l\" (UniqueName: \"kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.325734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.326297 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.328006 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.82798522 +0000 UTC m=+214.189811240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.367827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5gg9\" (UniqueName: \"kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9\") pod \"community-operators-4mn2q\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.425715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hbl4p" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.429042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.429333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.429368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjr4l\" (UniqueName: \"kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.429675 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.929659644 +0000 UTC m=+214.291485664 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.429700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.429725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.430024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.430066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content\") pod \"certified-operators-kfddj\" (UID: 
\"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.430284 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:00.9302716 +0000 UTC m=+214.292097620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.434082 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.457566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjr4l\" (UniqueName: \"kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l\") pod \"certified-operators-kfddj\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.530559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.530894 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.030867834 +0000 UTC m=+214.392693874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.531099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.531442 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.031428599 +0000 UTC m=+214.393254619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.533172 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.632969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles\") pod \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.633070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cztt\" (UniqueName: \"kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt\") pod \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.633126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config\") pod \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.633150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca\") pod \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.633246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.633285 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert\") pod \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\" (UID: \"ee70ad86-2c43-4d87-9465-9c020b5b4cec\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.634331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config" (OuterVolumeSpecName: "config") pod "ee70ad86-2c43-4d87-9465-9c020b5b4cec" (UID: "ee70ad86-2c43-4d87-9465-9c020b5b4cec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.634381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee70ad86-2c43-4d87-9465-9c020b5b4cec" (UID: "ee70ad86-2c43-4d87-9465-9c020b5b4cec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.634476 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.134436898 +0000 UTC m=+214.496262998 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.635067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ee70ad86-2c43-4d87-9465-9c020b5b4cec" (UID: "ee70ad86-2c43-4d87-9465-9c020b5b4cec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.641529 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.645789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt" (OuterVolumeSpecName: "kube-api-access-7cztt") pod "ee70ad86-2c43-4d87-9465-9c020b5b4cec" (UID: "ee70ad86-2c43-4d87-9465-9c020b5b4cec"). InnerVolumeSpecName "kube-api-access-7cztt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.645832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee70ad86-2c43-4d87-9465-9c020b5b4cec" (UID: "ee70ad86-2c43-4d87-9465-9c020b5b4cec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.657015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:21:00 crc kubenswrapper[4749]: W0225 07:21:00.713351 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff8f61f_b1e7_4a56_b1aa_f427189de773.slice/crio-72d7f4829b8dd23378a17ccda6bd5845896feef6547eab29086f3693f8845099 WatchSource:0}: Error finding container 72d7f4829b8dd23378a17ccda6bd5845896feef6547eab29086f3693f8845099: Status 404 returned error can't find the container with id 72d7f4829b8dd23378a17ccda6bd5845896feef6547eab29086f3693f8845099 Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.720811 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.739116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.739187 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee70ad86-2c43-4d87-9465-9c020b5b4cec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.739199 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc 
kubenswrapper[4749]: I0225 07:21:00.739208 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cztt\" (UniqueName: \"kubernetes.io/projected/ee70ad86-2c43-4d87-9465-9c020b5b4cec-kube-api-access-7cztt\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.739217 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.739225 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee70ad86-2c43-4d87-9465-9c020b5b4cec-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.739446 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.239435242 +0000 UTC m=+214.601261252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.752881 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h66ds"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.843878 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca\") pod \"e86f841c-6eab-4827-aa78-14db5b10dcc5\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.843929 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6phjz\" (UniqueName: \"kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz\") pod \"e86f841c-6eab-4827-aa78-14db5b10dcc5\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.843952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config\") pod \"e86f841c-6eab-4827-aa78-14db5b10dcc5\" (UID: \"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.843972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert\") pod \"e86f841c-6eab-4827-aa78-14db5b10dcc5\" (UID: 
\"e86f841c-6eab-4827-aa78-14db5b10dcc5\") " Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.844063 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.844249 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.344234491 +0000 UTC m=+214.706060511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.845042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config" (OuterVolumeSpecName: "config") pod "e86f841c-6eab-4827-aa78-14db5b10dcc5" (UID: "e86f841c-6eab-4827-aa78-14db5b10dcc5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.845295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca" (OuterVolumeSpecName: "client-ca") pod "e86f841c-6eab-4827-aa78-14db5b10dcc5" (UID: "e86f841c-6eab-4827-aa78-14db5b10dcc5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.850649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz" (OuterVolumeSpecName: "kube-api-access-6phjz") pod "e86f841c-6eab-4827-aa78-14db5b10dcc5" (UID: "e86f841c-6eab-4827-aa78-14db5b10dcc5"). InnerVolumeSpecName "kube-api-access-6phjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.852322 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e86f841c-6eab-4827-aa78-14db5b10dcc5" (UID: "e86f841c-6eab-4827-aa78-14db5b10dcc5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.913479 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.926294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:00 crc kubenswrapper[4749]: W0225 07:21:00.928510 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd685401_e0d6_42b1_8f37_f981f46c62b8.slice/crio-fda55101487832e2d54d7fcaa999c4996711358c29b6dfbf9daeb60a28fa58c5 WatchSource:0}: Error finding container fda55101487832e2d54d7fcaa999c4996711358c29b6dfbf9daeb60a28fa58c5: Status 404 returned error can't find the container with id fda55101487832e2d54d7fcaa999c4996711358c29b6dfbf9daeb60a28fa58c5 Feb 25 07:21:00 crc kubenswrapper[4749]: W0225 07:21:00.936126 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d303108_d189_4334_98e6_640b99c33faf.slice/crio-0726c5d8c457e83f3eb5638e8c81ad719c6f0c46a70deb162d344c1d143f1fcf WatchSource:0}: Error finding container 0726c5d8c457e83f3eb5638e8c81ad719c6f0c46a70deb162d344c1d143f1fcf: Status 404 returned error can't find the container with id 0726c5d8c457e83f3eb5638e8c81ad719c6f0c46a70deb162d344c1d143f1fcf Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.944963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:00 crc kubenswrapper[4749]: E0225 07:21:00.945270 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.445258796 +0000 UTC m=+214.807084816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.945786 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.945798 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6phjz\" (UniqueName: \"kubernetes.io/projected/e86f841c-6eab-4827-aa78-14db5b10dcc5-kube-api-access-6phjz\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.945806 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e86f841c-6eab-4827-aa78-14db5b10dcc5-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.945814 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e86f841c-6eab-4827-aa78-14db5b10dcc5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.968324 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:00 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:00 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:00 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:00 crc kubenswrapper[4749]: I0225 07:21:00.968390 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.046462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.046917 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.546875927 +0000 UTC m=+214.908701947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.104857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h66ds" event={"ID":"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2","Type":"ContainerStarted","Data":"a293fbb4e30f3eece8476ab7a3957f688fcbc04d33f4bf871b8a148cefe2e6e5"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.107401 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerID="d9fb533efebc17538921e1bc144a9693804a1c5685db2237132286f41e18aedf" exitCode=0 Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.107439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerDied","Data":"d9fb533efebc17538921e1bc144a9693804a1c5685db2237132286f41e18aedf"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.107453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerStarted","Data":"72d7f4829b8dd23378a17ccda6bd5845896feef6547eab29086f3693f8845099"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.110335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"901d79a31e400a0afbd372bb9070af38a99d4a78cc6295d0ff94daa7fe86ae6d"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.110354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"13e7ba604413322140a4f0df99893985b130fe89392d4946ad2881ac1951a96c"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.125846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" event={"ID":"3e2f1545-81aa-4204-bbcf-333ccd9c000e","Type":"ContainerStarted","Data":"24175a938c2e59ba329d4a0934d5d6566c359dd2a10a9842c915951e7f5f8113"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.125880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" event={"ID":"3e2f1545-81aa-4204-bbcf-333ccd9c000e","Type":"ContainerStarted","Data":"0acf9262c2f1b567463427d64ac9ba6421d2b1b3e91b183133884015ea4ef600"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.131328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6477232eb67399831a2dd099de0f0d8eb50092f9833ab29a6b2cd39b2204ef04"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.131454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b3a5af22fa4d106c0ebf743ab7a3c7916f1ff8d93e81a351ade858150d59dd78"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.132125 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.138716 4749 generic.go:334] "Generic (PLEG): container finished" podID="e86f841c-6eab-4827-aa78-14db5b10dcc5" containerID="f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365" exitCode=0 Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.138898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" event={"ID":"e86f841c-6eab-4827-aa78-14db5b10dcc5","Type":"ContainerDied","Data":"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.139033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" event={"ID":"e86f841c-6eab-4827-aa78-14db5b10dcc5","Type":"ContainerDied","Data":"4212f9736f562e9dc897fff010456e11196102b632a33980869f8cf5f82a9e6a"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.139125 4749 scope.go:117] "RemoveContainer" containerID="f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.139270 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.149188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.149586 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.649574038 +0000 UTC m=+215.011400058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.175538 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9jt7w" podStartSLOduration=11.175524848 podStartE2EDuration="11.175524848s" podCreationTimestamp="2026-02-25 07:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:01.175101017 +0000 UTC m=+214.536927037" watchObservedRunningTime="2026-02-25 07:21:01.175524848 +0000 UTC m=+214.537350868" Feb 25 07:21:01 crc kubenswrapper[4749]: 
I0225 07:21:01.201026 4749 scope.go:117] "RemoveContainer" containerID="f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.201081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerStarted","Data":"120bf757d7501987a3c104e8a4d028c1693b31140be8ac68ff3224c3071b19a4"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.201119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerStarted","Data":"fda55101487832e2d54d7fcaa999c4996711358c29b6dfbf9daeb60a28fa58c5"} Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.201684 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365\": container with ID starting with f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365 not found: ID does not exist" containerID="f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.201708 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365"} err="failed to get container status \"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365\": rpc error: code = NotFound desc = could not find container \"f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365\": container with ID starting with f5856ca411b48b029782575e99853453d74f96eb311d3349814d2015a477f365 not found: ID does not exist" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.219742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" event={"ID":"ee70ad86-2c43-4d87-9465-9c020b5b4cec","Type":"ContainerDied","Data":"34c4ddb7f564c96592f48cf93c71566586ed6181ca5ed4b943f81465f661ea25"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.219788 4749 scope.go:117] "RemoveContainer" containerID="59721d31941aa82e42b587faee5625042d14bd43108445a9ae39bc50d9a454c2" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.219881 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-x5ctv" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.225074 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.246247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a4e3ab0bab8e202c600873d456676558ba56da77bd376148cba02623ee39e9fa"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.249940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.250077 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.750061558 +0000 UTC m=+215.111887578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.250206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.250458 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.750450199 +0000 UTC m=+215.112276209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.263779 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.264363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerStarted","Data":"0726c5d8c457e83f3eb5638e8c81ad719c6f0c46a70deb162d344c1d143f1fcf"} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.267429 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnj7g"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.321506 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.335576 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86f841c-6eab-4827-aa78-14db5b10dcc5" path="/var/lib/kubelet/pods/e86f841c-6eab-4827-aa78-14db5b10dcc5/volumes" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.336621 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.351969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.352919 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-25 07:21:01.852883583 +0000 UTC m=+215.214709603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.361807 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-x5ctv"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.454519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.455981 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-25 07:21:01.955966155 +0000 UTC m=+215.317792175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jkzg5" (UID: "159a3e78-678a-495d-9621-d523b52df718") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.547358 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-25T07:21:01.321531225Z","Handler":null,"Name":""} Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.550357 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.550402 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.555321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.558199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
(OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.657151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.660129 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.660165 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.691870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jkzg5\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 
crc kubenswrapper[4749]: I0225 07:21:01.715980 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.758044 4749 ???:1] "http: TLS handshake error from 192.168.126.11:40434: no serving certificate available for the kubelet" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.846083 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.846537 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86f841c-6eab-4827-aa78-14db5b10dcc5" containerName="route-controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.846553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86f841c-6eab-4827-aa78-14db5b10dcc5" containerName="route-controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: E0225 07:21:01.846565 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerName="controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.846573 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerName="controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.846677 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" containerName="controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.846692 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86f841c-6eab-4827-aa78-14db5b10dcc5" containerName="route-controller-manager" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.847072 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.852560 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.853859 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.854085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.854207 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.854250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.854443 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.855723 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.856309 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.859830 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.859943 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.860036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.860141 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.860325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.860420 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.864277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.864303 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.865152 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.869369 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.870888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.877950 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.880864 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.969839 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:01 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:01 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:01 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.969920 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.977078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: 
\"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbxsl\" (UniqueName: \"kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979814 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkwn\" (UniqueName: \"kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.979947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pc4\" (UniqueName: \"kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.981276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content\") pod \"redhat-marketplace-x464k\" (UID: 
\"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.981299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.981390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:01 crc kubenswrapper[4749]: I0225 07:21:01.983211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:21:01 crc kubenswrapper[4749]: W0225 07:21:01.994151 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159a3e78_678a_495d_9621_d523b52df718.slice/crio-e2136c8395d4c11604d9e6eb0f659c0f684532130f2f33fa4f758deec41d78c9 WatchSource:0}: Error finding container e2136c8395d4c11604d9e6eb0f659c0f684532130f2f33fa4f758deec41d78c9: Status 404 returned error can't find the container with id e2136c8395d4c11604d9e6eb0f659c0f684532130f2f33fa4f758deec41d78c9 Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.082911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: 
\"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.082961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.082983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbxsl\" (UniqueName: \"kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config\") pod 
\"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkwn\" (UniqueName: \"kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pc4\" (UniqueName: \"kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083148 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.083817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.085407 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.085478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.085677 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.085778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.088411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.088487 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.090522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 
07:21:02.090542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.099055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkwn\" (UniqueName: \"kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn\") pod \"controller-manager-5f8477f8b-8vmpd\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.116291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbxsl\" (UniqueName: \"kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl\") pod \"redhat-marketplace-x464k\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.133411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pc4\" (UniqueName: \"kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4\") pod \"route-controller-manager-6998868cf4-h75vn\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.188008 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.193275 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.201346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.260821 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.261984 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.274042 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.279764 4749 generic.go:334] "Generic (PLEG): container finished" podID="fe08645e-d824-45ee-abb3-f6052d153605" containerID="831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d" exitCode=0 Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.279834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerDied","Data":"831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.279886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerStarted","Data":"5c71323c33ad714bd22b09ee4853e7ab8a55df96657ce873ab1ef8d68b98f9f8"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.283552 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d303108-d189-4334-98e6-640b99c33faf" containerID="d452703f077a4dc22f0d71e0301c8345dc777fd496006043e083a9307a5b398c" exitCode=0 Feb 25 
07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.283831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerDied","Data":"d452703f077a4dc22f0d71e0301c8345dc777fd496006043e083a9307a5b398c"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.290678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" event={"ID":"159a3e78-678a-495d-9621-d523b52df718","Type":"ContainerStarted","Data":"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.290712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" event={"ID":"159a3e78-678a-495d-9621-d523b52df718","Type":"ContainerStarted","Data":"e2136c8395d4c11604d9e6eb0f659c0f684532130f2f33fa4f758deec41d78c9"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.290771 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.299528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h66ds" event={"ID":"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2","Type":"ContainerStarted","Data":"16c6b7af04cbd08a4754df6269fa27d9200773e1fab1eebc0ecee31fff976b98"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.299577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h66ds" event={"ID":"33a0e4e2-c72e-49e4-9ff4-a4ffa0c751b2","Type":"ContainerStarted","Data":"25941cad1564e06aeb5406ad744e19ef00827adc795fdcfbd98a1a0c97efe830"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.321362 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd685401-e0d6-42b1-8f37-f981f46c62b8" 
containerID="120bf757d7501987a3c104e8a4d028c1693b31140be8ac68ff3224c3071b19a4" exitCode=0 Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.321422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerDied","Data":"120bf757d7501987a3c104e8a4d028c1693b31140be8ac68ff3224c3071b19a4"} Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.335421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h66ds" podStartSLOduration=162.335406216 podStartE2EDuration="2m42.335406216s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:02.334838389 +0000 UTC m=+215.696664409" watchObservedRunningTime="2026-02-25 07:21:02.335406216 +0000 UTC m=+215.697232236" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.353470 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" podStartSLOduration=162.35345295 podStartE2EDuration="2m42.35345295s" podCreationTimestamp="2026-02-25 07:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:02.351938767 +0000 UTC m=+215.713764787" watchObservedRunningTime="2026-02-25 07:21:02.35345295 +0000 UTC m=+215.715278970" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.416550 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 
crc kubenswrapper[4749]: I0225 07:21:02.416655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.416688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxkw2\" (UniqueName: \"kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.518311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.518879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxkw2\" (UniqueName: \"kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.518972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " 
pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.519995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.520159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.536439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxkw2\" (UniqueName: \"kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2\") pod \"redhat-marketplace-chmpx\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.620162 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.623032 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.625828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:02 crc kubenswrapper[4749]: W0225 07:21:02.668581 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5dfe132_00b3_4473_a4bc_d541b94a1581.slice/crio-37ffcdce68b27158f002a9365870f147bda97a258dfbd9752ee762366ab0cbfe WatchSource:0}: Error finding container 37ffcdce68b27158f002a9365870f147bda97a258dfbd9752ee762366ab0cbfe: Status 404 returned error can't find the container with id 37ffcdce68b27158f002a9365870f147bda97a258dfbd9752ee762366ab0cbfe Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.720578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:02 crc kubenswrapper[4749]: W0225 07:21:02.750474 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7567160a_f731_41dc_89f2_a926883bf7ab.slice/crio-be6f0efcec5cc980ba6ad484f44b3543571605bd3b994cc221f3bff89c764d12 WatchSource:0}: Error finding container be6f0efcec5cc980ba6ad484f44b3543571605bd3b994cc221f3bff89c764d12: Status 404 returned error can't find the container with id be6f0efcec5cc980ba6ad484f44b3543571605bd3b994cc221f3bff89c764d12 Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.859777 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.861604 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.866286 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.872638 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.967453 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:02 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:02 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:02 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.967504 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:02 crc kubenswrapper[4749]: I0225 07:21:02.990436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.030750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.031062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-cxln7\" (UniqueName: \"kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.031113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: W0225 07:21:03.039453 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8a4c468_f347_4632_a517_c0e9bb839e98.slice/crio-d754c34bcd105fe4c62b69d393aa04019f97e8653ba6d8f861dde270f461260a WatchSource:0}: Error finding container d754c34bcd105fe4c62b69d393aa04019f97e8653ba6d8f861dde270f461260a: Status 404 returned error can't find the container with id d754c34bcd105fe4c62b69d393aa04019f97e8653ba6d8f861dde270f461260a Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.132422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.132477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxln7\" (UniqueName: \"kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " 
pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.132543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.132980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.134807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.141207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.146922 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7fnnb" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.169436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxln7\" (UniqueName: \"kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7\") pod \"redhat-operators-cpkr5\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: 
I0225 07:21:03.201304 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.298065 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.299154 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.317639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.335362 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.335996 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee70ad86-2c43-4d87-9465-9c020b5b4cec" path="/var/lib/kubelet/pods/ee70ad86-2c43-4d87-9465-9c020b5b4cec/volumes" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.365298 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerID="6793b7a71491c5ef41ea8267e4732791aaefa54c7f9f43261d418f82e6ef7d80" exitCode=0 Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.365370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerDied","Data":"6793b7a71491c5ef41ea8267e4732791aaefa54c7f9f43261d418f82e6ef7d80"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.365394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" 
event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerStarted","Data":"1c6a3c79aa8caa68c52d56f1deaed660f0e81538bd5e47edaee2f9a66ff03d0f"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.387827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" event={"ID":"e5dfe132-00b3-4473-a4bc-d541b94a1581","Type":"ContainerStarted","Data":"984a216745134b95d75ef35fb8a8b8c2ca70380b3c5ab7a3c4c72766b480e6fb"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.387863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" event={"ID":"e5dfe132-00b3-4473-a4bc-d541b94a1581","Type":"ContainerStarted","Data":"37ffcdce68b27158f002a9365870f147bda97a258dfbd9752ee762366ab0cbfe"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.388611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.394560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" event={"ID":"7567160a-f731-41dc-89f2-a926883bf7ab","Type":"ContainerStarted","Data":"09c996211c3ec0d6312aa00c71c4b7185647d354126fa86a9fb9a0096254ad7e"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.394583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" event={"ID":"7567160a-f731-41dc-89f2-a926883bf7ab","Type":"ContainerStarted","Data":"be6f0efcec5cc980ba6ad484f44b3543571605bd3b994cc221f3bff89c764d12"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.395445 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:03 crc kubenswrapper[4749]: 
I0225 07:21:03.402146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerStarted","Data":"d754c34bcd105fe4c62b69d393aa04019f97e8653ba6d8f861dde270f461260a"} Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.411428 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.411471 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.433563 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" podStartSLOduration=3.433549322 podStartE2EDuration="3.433549322s" podCreationTimestamp="2026-02-25 07:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:03.432502233 +0000 UTC m=+216.794328253" watchObservedRunningTime="2026-02-25 07:21:03.433549322 +0000 UTC m=+216.795375342" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.441998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.442069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5t6c\" (UniqueName: \"kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c\") pod \"redhat-operators-cqprr\" 
(UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.442103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.450219 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.466887 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.477568 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" podStartSLOduration=3.4775515759999998 podStartE2EDuration="3.477551576s" podCreationTimestamp="2026-02-25 07:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:03.469967269 +0000 UTC m=+216.831793289" watchObservedRunningTime="2026-02-25 07:21:03.477551576 +0000 UTC m=+216.839377596" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.543309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.543511 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g5t6c\" (UniqueName: \"kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.543573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.544472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.557931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.579827 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.580547 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.587158 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-jwdk8 container/download-server 
namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.587191 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-jwdk8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.587210 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jwdk8" podUID="3f34750e-27ea-4d28-b6a9-0f65d2b29e92" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.587248 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jwdk8" podUID="3f34750e-27ea-4d28-b6a9-0f65d2b29e92" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.600141 4749 patch_prober.go:28] interesting pod/console-f9d7485db-wmg6x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.600195 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wmg6x" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.604274 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5t6c\" (UniqueName: \"kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c\") pod \"redhat-operators-cqprr\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.650134 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.708427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.771456 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.772115 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.778233 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.778338 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.783994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.847207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.847262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.917250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.949933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 
07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.949993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.950076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.965667 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.971996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.976125 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:03 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:03 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:03 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:03 crc kubenswrapper[4749]: I0225 07:21:03.976166 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.056291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.089839 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: W0225 07:21:04.095375 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f62d900_f9e9_462b_9dc1_4cfd3eaeb774.slice/crio-7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9 WatchSource:0}: Error finding container 7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9: Status 404 returned error can't find the container with id 7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9 Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.330992 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.331535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.338511 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.339022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.340411 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.340679 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.404666 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerID="df4404d0e2f3aef60f2128a15e67a628a06f0f9f0dbce2df27bf814f2aeee4a1" exitCode=0 Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.404758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerDied","Data":"df4404d0e2f3aef60f2128a15e67a628a06f0f9f0dbce2df27bf814f2aeee4a1"} Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.404973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerStarted","Data":"e0088db6006827c76c5b14c9ffd24aa8d44f85c3f0c5a0197021e74d755bbbd8"} Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.423308 4749 generic.go:334] "Generic (PLEG): container finished" podID="bec106c3-2cb1-4c08-8b5a-920797ec4142" containerID="3e1f4ce278983d29157591a2b038e7aa63556219a96f8a79dba5e9130a1a0df4" exitCode=0 Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.423416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" event={"ID":"bec106c3-2cb1-4c08-8b5a-920797ec4142","Type":"ContainerDied","Data":"3e1f4ce278983d29157591a2b038e7aa63556219a96f8a79dba5e9130a1a0df4"} Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.429029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerStarted","Data":"7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9"} Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.438374 4749 generic.go:334] "Generic (PLEG): container finished" podID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerID="a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17" exitCode=0 Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.439095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerDied","Data":"a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17"} Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.454693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-rl7ff" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.457184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.457293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.567993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.568121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.570087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.588277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.610648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 25 07:21:04 crc 
kubenswrapper[4749]: W0225 07:21:04.620687 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod28a22187_2c19_4ecc_97c9_d29aeb521050.slice/crio-3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39 WatchSource:0}: Error finding container 3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39: Status 404 returned error can't find the container with id 3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39 Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.673945 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.966229 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:04 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:04 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:04 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:04 crc kubenswrapper[4749]: I0225 07:21:04.966467 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.220878 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.470538 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerID="0e07959c96b3a4a206eef86c40104d0a734d892a43f789b98f202d0c9fd8bbd8" exitCode=0 Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 
07:21:05.470622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerDied","Data":"0e07959c96b3a4a206eef86c40104d0a734d892a43f789b98f202d0c9fd8bbd8"} Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.482819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28a22187-2c19-4ecc-97c9-d29aeb521050","Type":"ContainerStarted","Data":"add7fee4b9bfe7402476149408cfd04c56bd5b1164bc83f534c60f01d04301ec"} Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.482913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28a22187-2c19-4ecc-97c9-d29aeb521050","Type":"ContainerStarted","Data":"3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39"} Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.497095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4555686b-5316-4799-a666-1acb0644461b","Type":"ContainerStarted","Data":"079dbddb3ce9ad4351759a767402775119df1f7edbbf6e44deb1437fc5bc865d"} Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.788710 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.816748 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.816733942 podStartE2EDuration="2.816733942s" podCreationTimestamp="2026-02-25 07:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:05.509273006 +0000 UTC m=+218.871099016" watchObservedRunningTime="2026-02-25 07:21:05.816733942 +0000 UTC m=+219.178559962" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.885371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78x26\" (UniqueName: \"kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26\") pod \"bec106c3-2cb1-4c08-8b5a-920797ec4142\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.885444 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume\") pod \"bec106c3-2cb1-4c08-8b5a-920797ec4142\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.885531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume\") pod \"bec106c3-2cb1-4c08-8b5a-920797ec4142\" (UID: \"bec106c3-2cb1-4c08-8b5a-920797ec4142\") " Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.886557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume" (OuterVolumeSpecName: 
"config-volume") pod "bec106c3-2cb1-4c08-8b5a-920797ec4142" (UID: "bec106c3-2cb1-4c08-8b5a-920797ec4142"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.894096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26" (OuterVolumeSpecName: "kube-api-access-78x26") pod "bec106c3-2cb1-4c08-8b5a-920797ec4142" (UID: "bec106c3-2cb1-4c08-8b5a-920797ec4142"). InnerVolumeSpecName "kube-api-access-78x26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.909897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bec106c3-2cb1-4c08-8b5a-920797ec4142" (UID: "bec106c3-2cb1-4c08-8b5a-920797ec4142"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.966071 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:05 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:05 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:05 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.966160 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.987353 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bec106c3-2cb1-4c08-8b5a-920797ec4142-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.987418 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bec106c3-2cb1-4c08-8b5a-920797ec4142-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:05 crc kubenswrapper[4749]: I0225 07:21:05.987463 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78x26\" (UniqueName: \"kubernetes.io/projected/bec106c3-2cb1-4c08-8b5a-920797ec4142-kube-api-access-78x26\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.509053 4749 generic.go:334] "Generic (PLEG): container finished" podID="28a22187-2c19-4ecc-97c9-d29aeb521050" containerID="add7fee4b9bfe7402476149408cfd04c56bd5b1164bc83f534c60f01d04301ec" exitCode=0 Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 
07:21:06.509142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28a22187-2c19-4ecc-97c9-d29aeb521050","Type":"ContainerDied","Data":"add7fee4b9bfe7402476149408cfd04c56bd5b1164bc83f534c60f01d04301ec"} Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.545961 4749 generic.go:334] "Generic (PLEG): container finished" podID="4555686b-5316-4799-a666-1acb0644461b" containerID="71e1f1326192d2c0aacf2ca0cb10a28cb7a5afe24029a719a33f3e3256504c08" exitCode=0 Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.546047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4555686b-5316-4799-a666-1acb0644461b","Type":"ContainerDied","Data":"71e1f1326192d2c0aacf2ca0cb10a28cb7a5afe24029a719a33f3e3256504c08"} Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.554122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" event={"ID":"bec106c3-2cb1-4c08-8b5a-920797ec4142","Type":"ContainerDied","Data":"5fe74052c264a472c66b3552add0383235230ec340b99faef63a4a277d66d8a8"} Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.554178 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fe74052c264a472c66b3552add0383235230ec340b99faef63a4a277d66d8a8" Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.554226 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg" Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.824780 4749 ???:1] "http: TLS handshake error from 192.168.126.11:36564: no serving certificate available for the kubelet" Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.918879 4749 ???:1] "http: TLS handshake error from 192.168.126.11:36566: no serving certificate available for the kubelet" Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.967961 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:06 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:06 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:06 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:06 crc kubenswrapper[4749]: I0225 07:21:06.968060 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:07 crc kubenswrapper[4749]: I0225 07:21:07.966252 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:07 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:07 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:07 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:07 crc kubenswrapper[4749]: I0225 07:21:07.966610 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" 
podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:08 crc kubenswrapper[4749]: I0225 07:21:08.893263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8jn8v" Feb 25 07:21:08 crc kubenswrapper[4749]: I0225 07:21:08.976533 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:08 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:08 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:08 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:08 crc kubenswrapper[4749]: I0225 07:21:08.976581 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:09 crc kubenswrapper[4749]: I0225 07:21:09.965749 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:09 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:09 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:09 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:09 crc kubenswrapper[4749]: I0225 07:21:09.965819 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:10 crc 
kubenswrapper[4749]: I0225 07:21:10.964958 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:10 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:10 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:10 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:10 crc kubenswrapper[4749]: I0225 07:21:10.965019 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:11 crc kubenswrapper[4749]: I0225 07:21:11.964975 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:11 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Feb 25 07:21:11 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:11 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:11 crc kubenswrapper[4749]: I0225 07:21:11.965269 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:12 crc kubenswrapper[4749]: I0225 07:21:12.969728 4749 patch_prober.go:28] interesting pod/router-default-5444994796-zlfns container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 25 07:21:12 crc kubenswrapper[4749]: 
[-]has-synced failed: reason withheld Feb 25 07:21:12 crc kubenswrapper[4749]: [+]process-running ok Feb 25 07:21:12 crc kubenswrapper[4749]: healthz check failed Feb 25 07:21:12 crc kubenswrapper[4749]: I0225 07:21:12.970017 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zlfns" podUID="6a2834ef-28bc-4393-9acf-f78292d07b41" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 25 07:21:13 crc kubenswrapper[4749]: I0225 07:21:13.580529 4749 patch_prober.go:28] interesting pod/console-f9d7485db-wmg6x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 25 07:21:13 crc kubenswrapper[4749]: I0225 07:21:13.580583 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wmg6x" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 25 07:21:13 crc kubenswrapper[4749]: I0225 07:21:13.599901 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jwdk8" Feb 25 07:21:13 crc kubenswrapper[4749]: I0225 07:21:13.965999 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:21:13 crc kubenswrapper[4749]: I0225 07:21:13.968125 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zlfns" Feb 25 07:21:18 crc kubenswrapper[4749]: I0225 07:21:18.987891 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:18 crc kubenswrapper[4749]: I0225 07:21:18.988883 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerName="controller-manager" containerID="cri-o://984a216745134b95d75ef35fb8a8b8c2ca70380b3c5ab7a3c4c72766b480e6fb" gracePeriod=30 Feb 25 07:21:19 crc kubenswrapper[4749]: I0225 07:21:19.000239 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:19 crc kubenswrapper[4749]: I0225 07:21:19.000451 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" containerName="route-controller-manager" containerID="cri-o://09c996211c3ec0d6312aa00c71c4b7185647d354126fa86a9fb9a0096254ad7e" gracePeriod=30 Feb 25 07:21:19 crc kubenswrapper[4749]: I0225 07:21:19.651217 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerID="984a216745134b95d75ef35fb8a8b8c2ca70380b3c5ab7a3c4c72766b480e6fb" exitCode=0 Feb 25 07:21:19 crc kubenswrapper[4749]: I0225 07:21:19.651284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" event={"ID":"e5dfe132-00b3-4473-a4bc-d541b94a1581","Type":"ContainerDied","Data":"984a216745134b95d75ef35fb8a8b8c2ca70380b3c5ab7a3c4c72766b480e6fb"} Feb 25 07:21:20 crc kubenswrapper[4749]: I0225 07:21:20.659814 4749 generic.go:334] "Generic (PLEG): container finished" podID="7567160a-f731-41dc-89f2-a926883bf7ab" containerID="09c996211c3ec0d6312aa00c71c4b7185647d354126fa86a9fb9a0096254ad7e" exitCode=0 Feb 25 07:21:20 crc kubenswrapper[4749]: I0225 07:21:20.659884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" 
event={"ID":"7567160a-f731-41dc-89f2-a926883bf7ab","Type":"ContainerDied","Data":"09c996211c3ec0d6312aa00c71c4b7185647d354126fa86a9fb9a0096254ad7e"} Feb 25 07:21:21 crc kubenswrapper[4749]: E0225 07:21:21.103765 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 25 07:21:21 crc kubenswrapper[4749]: E0225 07:21:21.104267 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 25 07:21:21 crc kubenswrapper[4749]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 25 07:21:21 crc kubenswrapper[4749]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-flz6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29533400-6w2xx_openshift-infra(2a8700de-dc40-4245-8c99-e792c342b5bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 25 07:21:21 crc kubenswrapper[4749]: > logger="UnhandledError" Feb 25 07:21:21 crc kubenswrapper[4749]: E0225 07:21:21.106147 4749 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" Feb 25 07:21:21 crc kubenswrapper[4749]: E0225 07:21:21.666477 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" Feb 25 07:21:21 crc kubenswrapper[4749]: I0225 07:21:21.672779 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:21:21 crc kubenswrapper[4749]: I0225 07:21:21.672829 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:21:21 crc kubenswrapper[4749]: I0225 07:21:21.726584 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:21:22 crc kubenswrapper[4749]: I0225 07:21:22.189047 4749 patch_prober.go:28] interesting pod/controller-manager-5f8477f8b-8vmpd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Feb 25 07:21:22 crc 
kubenswrapper[4749]: I0225 07:21:22.189300 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Feb 25 07:21:22 crc kubenswrapper[4749]: I0225 07:21:22.195155 4749 patch_prober.go:28] interesting pod/route-controller-manager-6998868cf4-h75vn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 25 07:21:22 crc kubenswrapper[4749]: I0225 07:21:22.195213 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 25 07:21:22 crc kubenswrapper[4749]: I0225 07:21:22.868286 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:22 crc kubenswrapper[4749]: I0225 07:21:22.876535 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access\") pod \"28a22187-2c19-4ecc-97c9-d29aeb521050\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir\") pod \"28a22187-2c19-4ecc-97c9-d29aeb521050\" (UID: \"28a22187-2c19-4ecc-97c9-d29aeb521050\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access\") pod \"4555686b-5316-4799-a666-1acb0644461b\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016430 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir\") pod \"4555686b-5316-4799-a666-1acb0644461b\" (UID: \"4555686b-5316-4799-a666-1acb0644461b\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "28a22187-2c19-4ecc-97c9-d29aeb521050" (UID: "28a22187-2c19-4ecc-97c9-d29aeb521050"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.016728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4555686b-5316-4799-a666-1acb0644461b" (UID: "4555686b-5316-4799-a666-1acb0644461b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.029233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4555686b-5316-4799-a666-1acb0644461b" (UID: "4555686b-5316-4799-a666-1acb0644461b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.029351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "28a22187-2c19-4ecc-97c9-d29aeb521050" (UID: "28a22187-2c19-4ecc-97c9-d29aeb521050"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.117417 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/28a22187-2c19-4ecc-97c9-d29aeb521050-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.117460 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28a22187-2c19-4ecc-97c9-d29aeb521050-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.117472 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4555686b-5316-4799-a666-1acb0644461b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.117494 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4555686b-5316-4799-a666-1acb0644461b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.312787 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.317318 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.419978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert\") pod \"e5dfe132-00b3-4473-a4bc-d541b94a1581\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca\") pod \"7567160a-f731-41dc-89f2-a926883bf7ab\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles\") pod \"e5dfe132-00b3-4473-a4bc-d541b94a1581\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca\") pod \"e5dfe132-00b3-4473-a4bc-d541b94a1581\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config\") pod \"e5dfe132-00b3-4473-a4bc-d541b94a1581\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config\") pod \"7567160a-f731-41dc-89f2-a926883bf7ab\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4pc4\" (UniqueName: \"kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4\") pod \"7567160a-f731-41dc-89f2-a926883bf7ab\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420272 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert\") pod \"7567160a-f731-41dc-89f2-a926883bf7ab\" (UID: \"7567160a-f731-41dc-89f2-a926883bf7ab\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.420320 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjkwn\" (UniqueName: \"kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn\") pod \"e5dfe132-00b3-4473-a4bc-d541b94a1581\" (UID: \"e5dfe132-00b3-4473-a4bc-d541b94a1581\") " Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.421806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca" (OuterVolumeSpecName: "client-ca") pod "e5dfe132-00b3-4473-a4bc-d541b94a1581" (UID: "e5dfe132-00b3-4473-a4bc-d541b94a1581"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.422053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "7567160a-f731-41dc-89f2-a926883bf7ab" (UID: "7567160a-f731-41dc-89f2-a926883bf7ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.422075 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config" (OuterVolumeSpecName: "config") pod "e5dfe132-00b3-4473-a4bc-d541b94a1581" (UID: "e5dfe132-00b3-4473-a4bc-d541b94a1581"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.422535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config" (OuterVolumeSpecName: "config") pod "7567160a-f731-41dc-89f2-a926883bf7ab" (UID: "7567160a-f731-41dc-89f2-a926883bf7ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.422807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e5dfe132-00b3-4473-a4bc-d541b94a1581" (UID: "e5dfe132-00b3-4473-a4bc-d541b94a1581"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.445233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4" (OuterVolumeSpecName: "kube-api-access-j4pc4") pod "7567160a-f731-41dc-89f2-a926883bf7ab" (UID: "7567160a-f731-41dc-89f2-a926883bf7ab"). InnerVolumeSpecName "kube-api-access-j4pc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.445268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn" (OuterVolumeSpecName: "kube-api-access-xjkwn") pod "e5dfe132-00b3-4473-a4bc-d541b94a1581" (UID: "e5dfe132-00b3-4473-a4bc-d541b94a1581"). InnerVolumeSpecName "kube-api-access-xjkwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.445274 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e5dfe132-00b3-4473-a4bc-d541b94a1581" (UID: "e5dfe132-00b3-4473-a4bc-d541b94a1581"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.445371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7567160a-f731-41dc-89f2-a926883bf7ab" (UID: "7567160a-f731-41dc-89f2-a926883bf7ab"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521873 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjkwn\" (UniqueName: \"kubernetes.io/projected/e5dfe132-00b3-4473-a4bc-d541b94a1581-kube-api-access-xjkwn\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521920 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5dfe132-00b3-4473-a4bc-d541b94a1581-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521935 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521946 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521957 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521971 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dfe132-00b3-4473-a4bc-d541b94a1581-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521982 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7567160a-f731-41dc-89f2-a926883bf7ab-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.521994 4749 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-j4pc4\" (UniqueName: \"kubernetes.io/projected/7567160a-f731-41dc-89f2-a926883bf7ab-kube-api-access-j4pc4\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.522005 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7567160a-f731-41dc-89f2-a926883bf7ab-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.583586 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.588522 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.676809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" event={"ID":"e5dfe132-00b3-4473-a4bc-d541b94a1581","Type":"ContainerDied","Data":"37ffcdce68b27158f002a9365870f147bda97a258dfbd9752ee762366ab0cbfe"} Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.676870 4749 scope.go:117] "RemoveContainer" containerID="984a216745134b95d75ef35fb8a8b8c2ca70380b3c5ab7a3c4c72766b480e6fb" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.676990 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f8477f8b-8vmpd" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.681755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" event={"ID":"7567160a-f731-41dc-89f2-a926883bf7ab","Type":"ContainerDied","Data":"be6f0efcec5cc980ba6ad484f44b3543571605bd3b994cc221f3bff89c764d12"} Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.681826 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.695886 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.696022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"28a22187-2c19-4ecc-97c9-d29aeb521050","Type":"ContainerDied","Data":"3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39"} Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.696060 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd18fc75eab8c856d983732d39f07065d55923ea950e6e093f0f94a6f323a39" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.699044 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.699120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4555686b-5316-4799-a666-1acb0644461b","Type":"ContainerDied","Data":"079dbddb3ce9ad4351759a767402775119df1f7edbbf6e44deb1437fc5bc865d"} Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.699146 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079dbddb3ce9ad4351759a767402775119df1f7edbbf6e44deb1437fc5bc865d" Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.722518 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.727429 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6998868cf4-h75vn"] Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.732374 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:23 crc kubenswrapper[4749]: I0225 07:21:23.735094 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f8477f8b-8vmpd"] Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.333390 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" path="/var/lib/kubelet/pods/7567160a-f731-41dc-89f2-a926883bf7ab/volumes" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.337049 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" path="/var/lib/kubelet/pods/e5dfe132-00b3-4473-a4bc-d541b94a1581/volumes" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.862461 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:25 crc kubenswrapper[4749]: E0225 07:21:25.862992 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerName="controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863014 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerName="controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: E0225 07:21:25.863024 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" containerName="route-controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" containerName="route-controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: E0225 07:21:25.863147 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec106c3-2cb1-4c08-8b5a-920797ec4142" containerName="collect-profiles" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863154 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec106c3-2cb1-4c08-8b5a-920797ec4142" containerName="collect-profiles" Feb 25 07:21:25 crc kubenswrapper[4749]: E0225 07:21:25.863172 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4555686b-5316-4799-a666-1acb0644461b" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863178 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4555686b-5316-4799-a666-1acb0644461b" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: E0225 07:21:25.863201 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a22187-2c19-4ecc-97c9-d29aeb521050" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863208 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="28a22187-2c19-4ecc-97c9-d29aeb521050" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863804 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec106c3-2cb1-4c08-8b5a-920797ec4142" containerName="collect-profiles" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863833 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a22187-2c19-4ecc-97c9-d29aeb521050" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863847 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7567160a-f731-41dc-89f2-a926883bf7ab" containerName="route-controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4555686b-5316-4799-a666-1acb0644461b" containerName="pruner" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.863885 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5dfe132-00b3-4473-a4bc-d541b94a1581" containerName="controller-manager" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.864461 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.868864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.871727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.877493 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.877699 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.878659 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.879028 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.880847 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.881585 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.887984 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.889102 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.889320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.889500 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.889786 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.890010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.889559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.890528 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.897427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.955721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.955827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.955872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2x8l\" (UniqueName: \"kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.955916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:25 crc kubenswrapper[4749]: I0225 07:21:25.956089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " 
pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.057525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.057584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.057642 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wql8f\" (UniqueName: \"kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.057908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.057990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.058039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.058062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x8l\" (UniqueName: \"kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.058090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.058128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc 
kubenswrapper[4749]: I0225 07:21:26.058749 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.059458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.059649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.082814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x8l\" (UniqueName: \"kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.084929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert\") pod \"controller-manager-846bf578b7-l7qcj\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " 
pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.159556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.159628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wql8f\" (UniqueName: \"kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.159672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.159703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.160794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.161266 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.168668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.177937 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wql8f\" (UniqueName: \"kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f\") pod \"route-controller-manager-65fcd66b4-qppnm\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.190454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:26 crc kubenswrapper[4749]: I0225 07:21:26.209651 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:27 crc kubenswrapper[4749]: E0225 07:21:27.198827 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 07:21:27 crc kubenswrapper[4749]: E0225 07:21:27.199167 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbxsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Start
upProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-x464k_openshift-marketplace(8f869be2-b41b-4117-a9b4-ed628ae0d30b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 07:21:27 crc kubenswrapper[4749]: E0225 07:21:27.200664 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-x464k" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" Feb 25 07:21:27 crc kubenswrapper[4749]: I0225 07:21:27.418214 4749 ???:1] "http: TLS handshake error from 192.168.126.11:46018: no serving certificate available for the kubelet" Feb 25 07:21:28 crc kubenswrapper[4749]: E0225 07:21:28.578996 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-x464k" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" Feb 25 07:21:29 crc kubenswrapper[4749]: I0225 07:21:29.629981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:21:33 crc kubenswrapper[4749]: I0225 07:21:33.716227 4749 scope.go:117] "RemoveContainer" containerID="09c996211c3ec0d6312aa00c71c4b7185647d354126fa86a9fb9a0096254ad7e" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.793485 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 07:21:33 crc kubenswrapper[4749]: 
E0225 07:21:33.793648 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxln7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cpkr5_openshift-marketplace(7e2f1824-0c70-45fa-8b96-c41047b08d69): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.795512 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cpkr5" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.807537 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.807681 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5t6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile
:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cqprr_openshift-marketplace(8f62d900-f9e9-462b-9dc1-4cfd3eaeb774): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.808992 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cqprr" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.829981 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.830103 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mxkw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-chmpx_openshift-marketplace(c8a4c468-f347-4632-a517-c0e9bb839e98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 25 07:21:33 crc kubenswrapper[4749]: E0225 07:21:33.831281 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-chmpx" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" Feb 25 07:21:34 crc 
kubenswrapper[4749]: I0225 07:21:34.039525 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.056107 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5x8pj" Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.101936 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.759418 4749 generic.go:334] "Generic (PLEG): container finished" podID="fe08645e-d824-45ee-abb3-f6052d153605" containerID="21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d" exitCode=0 Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.759471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerDied","Data":"21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.761567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" event={"ID":"53bf5621-fa8b-4221-b4d9-8f665d42f371","Type":"ContainerStarted","Data":"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.761639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" event={"ID":"53bf5621-fa8b-4221-b4d9-8f665d42f371","Type":"ContainerStarted","Data":"a74d422f264a5f2c0fc01a586e6714b27fb93a6d4e9152ef2298cd470626f0e7"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.762177 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.765084 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerID="896f742f84f19750c6bfa48b08974c4b4d31d50a9db359fddbe46a13627e20fb" exitCode=0 Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.765189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerDied","Data":"896f742f84f19750c6bfa48b08974c4b4d31d50a9db359fddbe46a13627e20fb"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.768099 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d303108-d189-4334-98e6-640b99c33faf" containerID="936b8789cf4393289c39de1f3946f1870a5854873f6137a60a609300caf62f2d" exitCode=0 Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.768228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerDied","Data":"936b8789cf4393289c39de1f3946f1870a5854873f6137a60a609300caf62f2d"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.775492 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerID="f7b732d4aca42f8c73c5e7d4764b394064702e3b364c5be723737698f2017eab" exitCode=0 Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.775611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerDied","Data":"f7b732d4aca42f8c73c5e7d4764b394064702e3b364c5be723737698f2017eab"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.777660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" event={"ID":"0a09ebe4-c502-485e-9b10-d09ba48b9c75","Type":"ContainerStarted","Data":"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.777733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" event={"ID":"0a09ebe4-c502-485e-9b10-d09ba48b9c75","Type":"ContainerStarted","Data":"aa6c8b7945524b6ace8584095b434f2e2224f9551c59121d67133cdfe556cd19"} Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.778944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:34 crc kubenswrapper[4749]: E0225 07:21:34.780122 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cqprr" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" Feb 25 07:21:34 crc kubenswrapper[4749]: E0225 07:21:34.780338 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-chmpx" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" Feb 25 07:21:34 crc kubenswrapper[4749]: E0225 07:21:34.780647 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cpkr5" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.791780 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.843807 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" podStartSLOduration=15.84378897 podStartE2EDuration="15.84378897s" podCreationTimestamp="2026-02-25 07:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:34.839741519 +0000 UTC m=+248.201567549" watchObservedRunningTime="2026-02-25 07:21:34.84378897 +0000 UTC m=+248.205615000" Feb 25 07:21:34 crc kubenswrapper[4749]: I0225 07:21:34.883795 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" podStartSLOduration=15.883732282 podStartE2EDuration="15.883732282s" podCreationTimestamp="2026-02-25 07:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:34.88109948 +0000 UTC m=+248.242925530" watchObservedRunningTime="2026-02-25 07:21:34.883732282 +0000 UTC m=+248.245558332" Feb 25 07:21:35 crc kubenswrapper[4749]: I0225 07:21:35.248508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.162365 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.163726 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.166188 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.169577 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.171031 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.297428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.297673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.399531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.399645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.399780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.425255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.488884 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.795313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerStarted","Data":"7781b4539e67b111f9881202cac6c9da17118a0ecd6c1b19300d33c851b9561c"} Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.796828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" event={"ID":"2a8700de-dc40-4245-8c99-e792c342b5bb","Type":"ContainerStarted","Data":"d91e372cb404b8a650775dd575faf749d114c6f3349d530171b962c23118b0dc"} Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.799481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerStarted","Data":"af1fb59810b6dae52d8a2a2a5772c89962bf54cfa901a2cba3aeddd79ea21deb"} Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.802968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerStarted","Data":"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c"} Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.806346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerStarted","Data":"5d71727286b8f34049b1a2cca21d0779c393ee80e5f10e4acefb551f4d87db85"} Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.820880 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8fzjk" podStartSLOduration=2.535037348 podStartE2EDuration="37.820861084s" podCreationTimestamp="2026-02-25 07:20:59 
+0000 UTC" firstStartedPulling="2026-02-25 07:21:01.109686826 +0000 UTC m=+214.471512846" lastFinishedPulling="2026-02-25 07:21:36.395510562 +0000 UTC m=+249.757336582" observedRunningTime="2026-02-25 07:21:36.816779482 +0000 UTC m=+250.178605492" watchObservedRunningTime="2026-02-25 07:21:36.820861084 +0000 UTC m=+250.182687104" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.835163 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nkp47" podStartSLOduration=2.78095648 podStartE2EDuration="37.835147125s" podCreationTimestamp="2026-02-25 07:20:59 +0000 UTC" firstStartedPulling="2026-02-25 07:21:01.212587073 +0000 UTC m=+214.574413093" lastFinishedPulling="2026-02-25 07:21:36.266777718 +0000 UTC m=+249.628603738" observedRunningTime="2026-02-25 07:21:36.832127632 +0000 UTC m=+250.193953652" watchObservedRunningTime="2026-02-25 07:21:36.835147125 +0000 UTC m=+250.196973145" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.850275 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfddj" podStartSLOduration=2.731025635 podStartE2EDuration="36.850261399s" podCreationTimestamp="2026-02-25 07:21:00 +0000 UTC" firstStartedPulling="2026-02-25 07:21:02.292435189 +0000 UTC m=+215.654261209" lastFinishedPulling="2026-02-25 07:21:36.411670953 +0000 UTC m=+249.773496973" observedRunningTime="2026-02-25 07:21:36.848977493 +0000 UTC m=+250.210803503" watchObservedRunningTime="2026-02-25 07:21:36.850261399 +0000 UTC m=+250.212087419" Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.863840 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.869164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4mn2q" podStartSLOduration=2.887906348 
podStartE2EDuration="36.869139635s" podCreationTimestamp="2026-02-25 07:21:00 +0000 UTC" firstStartedPulling="2026-02-25 07:21:02.292070129 +0000 UTC m=+215.653896149" lastFinishedPulling="2026-02-25 07:21:36.273303426 +0000 UTC m=+249.635129436" observedRunningTime="2026-02-25 07:21:36.868413195 +0000 UTC m=+250.230239215" watchObservedRunningTime="2026-02-25 07:21:36.869139635 +0000 UTC m=+250.230965655" Feb 25 07:21:36 crc kubenswrapper[4749]: W0225 07:21:36.870334 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04b76053_c9b9_40f2_ada0_ea7839916603.slice/crio-0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c WatchSource:0}: Error finding container 0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c: Status 404 returned error can't find the container with id 0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c Feb 25 07:21:36 crc kubenswrapper[4749]: I0225 07:21:36.881957 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" podStartSLOduration=56.845189273 podStartE2EDuration="1m36.881940185s" podCreationTimestamp="2026-02-25 07:20:00 +0000 UTC" firstStartedPulling="2026-02-25 07:20:56.23604663 +0000 UTC m=+209.597872660" lastFinishedPulling="2026-02-25 07:21:36.272797552 +0000 UTC m=+249.634623572" observedRunningTime="2026-02-25 07:21:36.880162457 +0000 UTC m=+250.241988477" watchObservedRunningTime="2026-02-25 07:21:36.881940185 +0000 UTC m=+250.243766205" Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.050947 4749 csr.go:261] certificate signing request csr-sh9s8 is approved, waiting to be issued Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.057328 4749 csr.go:257] certificate signing request csr-sh9s8 is issued Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.813317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"04b76053-c9b9-40f2-ada0-ea7839916603","Type":"ContainerStarted","Data":"c95be46e1c2e1b39c53fb51ef7da1b307c26a97013e17681d0f12f5f1c005a62"} Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.813576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04b76053-c9b9-40f2-ada0-ea7839916603","Type":"ContainerStarted","Data":"0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c"} Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.814512 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a8700de-dc40-4245-8c99-e792c342b5bb" containerID="d91e372cb404b8a650775dd575faf749d114c6f3349d530171b962c23118b0dc" exitCode=0 Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.815080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" event={"ID":"2a8700de-dc40-4245-8c99-e792c342b5bb","Type":"ContainerDied","Data":"d91e372cb404b8a650775dd575faf749d114c6f3349d530171b962c23118b0dc"} Feb 25 07:21:37 crc kubenswrapper[4749]: I0225 07:21:37.841959 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.841940581 podStartE2EDuration="1.841940581s" podCreationTimestamp="2026-02-25 07:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:37.829757248 +0000 UTC m=+251.191583268" watchObservedRunningTime="2026-02-25 07:21:37.841940581 +0000 UTC m=+251.203766601" Feb 25 07:21:38 crc kubenswrapper[4749]: I0225 07:21:38.058307 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-13 09:25:58.548164183 +0000 UTC Feb 25 07:21:38 crc kubenswrapper[4749]: I0225 07:21:38.058354 4749 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 7730h4m20.489813489s for next certificate rotation Feb 25 07:21:38 crc kubenswrapper[4749]: I0225 07:21:38.822620 4749 generic.go:334] "Generic (PLEG): container finished" podID="04b76053-c9b9-40f2-ada0-ea7839916603" containerID="c95be46e1c2e1b39c53fb51ef7da1b307c26a97013e17681d0f12f5f1c005a62" exitCode=0 Feb 25 07:21:38 crc kubenswrapper[4749]: I0225 07:21:38.823069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04b76053-c9b9-40f2-ada0-ea7839916603","Type":"ContainerDied","Data":"c95be46e1c2e1b39c53fb51ef7da1b307c26a97013e17681d0f12f5f1c005a62"} Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.020067 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.020580 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" podUID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" containerName="controller-manager" containerID="cri-o://d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37" gracePeriod=30 Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.059204 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-20 17:36:05.128033396 +0000 UTC Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.059241 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6442h14m26.068794476s for next certificate rotation Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.126271 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.126514 4749 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" podUID="53bf5621-fa8b-4221-b4d9-8f665d42f371" containerName="route-controller-manager" containerID="cri-o://7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b" gracePeriod=30 Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.270982 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.338911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flz6x\" (UniqueName: \"kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x\") pod \"2a8700de-dc40-4245-8c99-e792c342b5bb\" (UID: \"2a8700de-dc40-4245-8c99-e792c342b5bb\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.344716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x" (OuterVolumeSpecName: "kube-api-access-flz6x") pod "2a8700de-dc40-4245-8c99-e792c342b5bb" (UID: "2a8700de-dc40-4245-8c99-e792c342b5bb"). InnerVolumeSpecName "kube-api-access-flz6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.441152 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flz6x\" (UniqueName: \"kubernetes.io/projected/2a8700de-dc40-4245-8c99-e792c342b5bb-kube-api-access-flz6x\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.456050 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.471207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.472293 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert\") pod \"53bf5621-fa8b-4221-b4d9-8f665d42f371\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert\") pod \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config\") pod \"53bf5621-fa8b-4221-b4d9-8f665d42f371\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles\") pod \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541954 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wql8f\" (UniqueName: \"kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f\") pod \"53bf5621-fa8b-4221-b4d9-8f665d42f371\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.541992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca\") pod \"53bf5621-fa8b-4221-b4d9-8f665d42f371\" (UID: \"53bf5621-fa8b-4221-b4d9-8f665d42f371\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.542734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0a09ebe4-c502-485e-9b10-d09ba48b9c75" (UID: "0a09ebe4-c502-485e-9b10-d09ba48b9c75"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.542815 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config" (OuterVolumeSpecName: "config") pod "53bf5621-fa8b-4221-b4d9-8f665d42f371" (UID: "53bf5621-fa8b-4221-b4d9-8f665d42f371"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.542841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca" (OuterVolumeSpecName: "client-ca") pod "53bf5621-fa8b-4221-b4d9-8f665d42f371" (UID: "53bf5621-fa8b-4221-b4d9-8f665d42f371"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.542943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config\") pod \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.543773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config" (OuterVolumeSpecName: "config") pod "0a09ebe4-c502-485e-9b10-d09ba48b9c75" (UID: "0a09ebe4-c502-485e-9b10-d09ba48b9c75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.544684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca\") pod \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.544761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2x8l\" (UniqueName: \"kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l\") pod \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\" (UID: \"0a09ebe4-c502-485e-9b10-d09ba48b9c75\") " Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545546 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545565 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545575 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545584 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bf5621-fa8b-4221-b4d9-8f665d42f371-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a09ebe4-c502-485e-9b10-d09ba48b9c75" (UID: "0a09ebe4-c502-485e-9b10-d09ba48b9c75"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.545837 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f" (OuterVolumeSpecName: "kube-api-access-wql8f") pod "53bf5621-fa8b-4221-b4d9-8f665d42f371" (UID: "53bf5621-fa8b-4221-b4d9-8f665d42f371"). InnerVolumeSpecName "kube-api-access-wql8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.546418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a09ebe4-c502-485e-9b10-d09ba48b9c75" (UID: "0a09ebe4-c502-485e-9b10-d09ba48b9c75"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.546832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53bf5621-fa8b-4221-b4d9-8f665d42f371" (UID: "53bf5621-fa8b-4221-b4d9-8f665d42f371"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.548222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l" (OuterVolumeSpecName: "kube-api-access-d2x8l") pod "0a09ebe4-c502-485e-9b10-d09ba48b9c75" (UID: "0a09ebe4-c502-485e-9b10-d09ba48b9c75"). InnerVolumeSpecName "kube-api-access-d2x8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.647410 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53bf5621-fa8b-4221-b4d9-8f665d42f371-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.647466 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a09ebe4-c502-485e-9b10-d09ba48b9c75-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.647486 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wql8f\" (UniqueName: \"kubernetes.io/projected/53bf5621-fa8b-4221-b4d9-8f665d42f371-kube-api-access-wql8f\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.647506 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a09ebe4-c502-485e-9b10-d09ba48b9c75-client-ca\") on node \"crc\" 
DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.647523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2x8l\" (UniqueName: \"kubernetes.io/projected/0a09ebe4-c502-485e-9b10-d09ba48b9c75-kube-api-access-d2x8l\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.829489 4749 generic.go:334] "Generic (PLEG): container finished" podID="53bf5621-fa8b-4221-b4d9-8f665d42f371" containerID="7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b" exitCode=0 Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.829551 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.830541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" event={"ID":"53bf5621-fa8b-4221-b4d9-8f665d42f371","Type":"ContainerDied","Data":"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b"} Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.830670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm" event={"ID":"53bf5621-fa8b-4221-b4d9-8f665d42f371","Type":"ContainerDied","Data":"a74d422f264a5f2c0fc01a586e6714b27fb93a6d4e9152ef2298cd470626f0e7"} Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.830705 4749 scope.go:117] "RemoveContainer" containerID="7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.831784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" event={"ID":"2a8700de-dc40-4245-8c99-e792c342b5bb","Type":"ContainerDied","Data":"ca92fcd2cee8b86501e97ca8080b59de90b813c81f422cc83fdb095f715ed219"} Feb 25 07:21:39 
crc kubenswrapper[4749]: I0225 07:21:39.831880 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca92fcd2cee8b86501e97ca8080b59de90b813c81f422cc83fdb095f715ed219" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.831836 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533400-6w2xx" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.833697 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" containerID="d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37" exitCode=0 Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.833757 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.833807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" event={"ID":"0a09ebe4-c502-485e-9b10-d09ba48b9c75","Type":"ContainerDied","Data":"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37"} Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.833835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-846bf578b7-l7qcj" event={"ID":"0a09ebe4-c502-485e-9b10-d09ba48b9c75","Type":"ContainerDied","Data":"aa6c8b7945524b6ace8584095b434f2e2224f9551c59121d67133cdfe556cd19"} Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.855412 4749 scope.go:117] "RemoveContainer" containerID="7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b" Feb 25 07:21:39 crc kubenswrapper[4749]: E0225 07:21:39.855881 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b\": container 
with ID starting with 7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b not found: ID does not exist" containerID="7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.855907 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b"} err="failed to get container status \"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b\": rpc error: code = NotFound desc = could not find container \"7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b\": container with ID starting with 7937a4be68b7fe63ecdb5b3bfe4a3993839ca98f36475f2a507274f89386d98b not found: ID does not exist" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.855927 4749 scope.go:117] "RemoveContainer" containerID="d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.858985 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.864217 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65fcd66b4-qppnm"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.870122 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.872350 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-846bf578b7-l7qcj"] Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.877420 4749 scope.go:117] "RemoveContainer" containerID="d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37" Feb 25 07:21:39 crc kubenswrapper[4749]: E0225 07:21:39.881608 4749 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37\": container with ID starting with d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37 not found: ID does not exist" containerID="d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37" Feb 25 07:21:39 crc kubenswrapper[4749]: I0225 07:21:39.881658 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37"} err="failed to get container status \"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37\": rpc error: code = NotFound desc = could not find container \"d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37\": container with ID starting with d208fda98c244093c6d83eb9265f37214f4c1b7f54f46fb1c00a593cb189ac37 not found: ID does not exist" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.028834 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.053719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir\") pod \"04b76053-c9b9-40f2-ada0-ea7839916603\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.053814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "04b76053-c9b9-40f2-ada0-ea7839916603" (UID: "04b76053-c9b9-40f2-ada0-ea7839916603"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.053886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access\") pod \"04b76053-c9b9-40f2-ada0-ea7839916603\" (UID: \"04b76053-c9b9-40f2-ada0-ea7839916603\") " Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.054163 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04b76053-c9b9-40f2-ada0-ea7839916603-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.058880 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "04b76053-c9b9-40f2-ada0-ea7839916603" (UID: "04b76053-c9b9-40f2-ada0-ea7839916603"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.078061 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.078132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.144226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.144275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.155793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04b76053-c9b9-40f2-ada0-ea7839916603-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.203870 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.203963 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.435426 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.435486 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.491719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.642777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.642866 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.708429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.842787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04b76053-c9b9-40f2-ada0-ea7839916603","Type":"ContainerDied","Data":"0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c"} Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.842822 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.842825 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0668f5270754fbdc9f10ca1bc6c0d9baecff69eb2578be18f8536adcdf94981c" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.861858 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:21:40 crc kubenswrapper[4749]: E0225 07:21:40.862267 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" containerName="controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862307 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" containerName="controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: E0225 07:21:40.862331 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bf5621-fa8b-4221-b4d9-8f665d42f371" containerName="route-controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862345 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bf5621-fa8b-4221-b4d9-8f665d42f371" containerName="route-controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: E0225 07:21:40.862369 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b76053-c9b9-40f2-ada0-ea7839916603" containerName="pruner" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862381 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b76053-c9b9-40f2-ada0-ea7839916603" containerName="pruner" Feb 25 07:21:40 crc kubenswrapper[4749]: E0225 07:21:40.862411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" containerName="oc" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862423 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" containerName="oc" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862620 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bf5621-fa8b-4221-b4d9-8f665d42f371" containerName="route-controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862650 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" containerName="oc" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862676 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b76053-c9b9-40f2-ada0-ea7839916603" containerName="pruner" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.862691 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" containerName="controller-manager" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.863376 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.864074 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.864734 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.871783 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.872041 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.873001 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.873004 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.873934 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.874150 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.874183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.874351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.874923 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.875177 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 
07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.875401 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.879638 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.882169 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.885586 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.888974 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4pq4\" (UniqueName: \"kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6js7\" (UniqueName: \"kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " 
pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:40 crc kubenswrapper[4749]: I0225 07:21:40.963517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.065159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.065210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6js7\" (UniqueName: \"kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.065248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.065296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.065326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.066766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.066858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 
07:21:41.066955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.066983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4pq4\" (UniqueName: \"kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.067125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.067181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.068095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " 
pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.069476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.069728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.078204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert\") pod \"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.079539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.081537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6js7\" (UniqueName: \"kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7\") pod 
\"controller-manager-b45fcccb-h8qbz\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.094190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4pq4\" (UniqueName: \"kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4\") pod \"route-controller-manager-84bc784f4f-xz4hl\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.194436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.215581 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.334336 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a09ebe4-c502-485e-9b10-d09ba48b9c75" path="/var/lib/kubelet/pods/0a09ebe4-c502-485e-9b10-d09ba48b9c75/volumes" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.335060 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bf5621-fa8b-4221-b4d9-8f665d42f371" path="/var/lib/kubelet/pods/53bf5621-fa8b-4221-b4d9-8f665d42f371/volumes" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.407910 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:21:41 crc kubenswrapper[4749]: W0225 07:21:41.414332 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b9b68e3_291c_40f6_b59b_8a0c3a9a3e59.slice/crio-f452dae90a711398d2f16fdf2490097a0654b80dcbb2157b1a8bc5d92c1097c4 WatchSource:0}: Error finding container f452dae90a711398d2f16fdf2490097a0654b80dcbb2157b1a8bc5d92c1097c4: Status 404 returned error can't find the container with id f452dae90a711398d2f16fdf2490097a0654b80dcbb2157b1a8bc5d92c1097c4 Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.452269 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:21:41 crc kubenswrapper[4749]: W0225 07:21:41.459927 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode051ce6a_255e_4b63_93c4_5679a453341d.slice/crio-6b61876996c3fc4b4e400c0992ce3f7769cd9172031e6a87f47d4eb1d6e8da46 WatchSource:0}: Error finding container 6b61876996c3fc4b4e400c0992ce3f7769cd9172031e6a87f47d4eb1d6e8da46: Status 404 returned error can't find the container with id 6b61876996c3fc4b4e400c0992ce3f7769cd9172031e6a87f47d4eb1d6e8da46 Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.868708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" event={"ID":"e051ce6a-255e-4b63-93c4-5679a453341d","Type":"ContainerStarted","Data":"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680"} Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.869868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" event={"ID":"e051ce6a-255e-4b63-93c4-5679a453341d","Type":"ContainerStarted","Data":"6b61876996c3fc4b4e400c0992ce3f7769cd9172031e6a87f47d4eb1d6e8da46"} Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.869982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.871510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" event={"ID":"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59","Type":"ContainerStarted","Data":"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a"} Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.871654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" event={"ID":"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59","Type":"ContainerStarted","Data":"f452dae90a711398d2f16fdf2490097a0654b80dcbb2157b1a8bc5d92c1097c4"} Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.872045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.880981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.887576 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" podStartSLOduration=2.887559973 podStartE2EDuration="2.887559973s" podCreationTimestamp="2026-02-25 07:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:41.884062848 +0000 UTC m=+255.245888898" watchObservedRunningTime="2026-02-25 07:21:41.887559973 +0000 UTC m=+255.249385993" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.906858 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" 
podStartSLOduration=2.9068430210000002 podStartE2EDuration="2.906843021s" podCreationTimestamp="2026-02-25 07:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:21:41.903563112 +0000 UTC m=+255.265389132" watchObservedRunningTime="2026-02-25 07:21:41.906843021 +0000 UTC m=+255.268669031" Feb 25 07:21:41 crc kubenswrapper[4749]: I0225 07:21:41.937098 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:21:42 crc kubenswrapper[4749]: I0225 07:21:42.010705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:42 crc kubenswrapper[4749]: I0225 07:21:42.878156 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerID="e668762484a92d18135913cbefb9b239ba7b4887dd3d8b0c3c87c34fefc72b93" exitCode=0 Feb 25 07:21:42 crc kubenswrapper[4749]: I0225 07:21:42.878301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerDied","Data":"e668762484a92d18135913cbefb9b239ba7b4887dd3d8b0c3c87c34fefc72b93"} Feb 25 07:21:43 crc kubenswrapper[4749]: I0225 07:21:43.885236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerStarted","Data":"886b026975f2dac30b1a71c4801c3333e1705e479791593946741f718ca6b798"} Feb 25 07:21:43 crc kubenswrapper[4749]: I0225 07:21:43.900640 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x464k" podStartSLOduration=2.995405371 podStartE2EDuration="42.900625553s" podCreationTimestamp="2026-02-25 
07:21:01 +0000 UTC" firstStartedPulling="2026-02-25 07:21:03.367057902 +0000 UTC m=+216.728883922" lastFinishedPulling="2026-02-25 07:21:43.272278084 +0000 UTC m=+256.634104104" observedRunningTime="2026-02-25 07:21:43.89978838 +0000 UTC m=+257.261614400" watchObservedRunningTime="2026-02-25 07:21:43.900625553 +0000 UTC m=+257.262451563" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.164931 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.165741 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.168544 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.168544 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.178685 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.204822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.204917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" 
Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.204936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.306655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.306917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.307032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.307151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.306797 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.344326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access\") pod \"installer-9-crc\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.486419 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:21:44 crc kubenswrapper[4749]: I0225 07:21:44.950691 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 25 07:21:45 crc kubenswrapper[4749]: I0225 07:21:45.894859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"011732e4-f837-46cd-91c1-c05625b9b584","Type":"ContainerStarted","Data":"c357c2321d7ae9491991e96bc1e412dd213c5b5e8c147b263e6e278130c682ba"} Feb 25 07:21:45 crc kubenswrapper[4749]: I0225 07:21:45.895227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"011732e4-f837-46cd-91c1-c05625b9b584","Type":"ContainerStarted","Data":"52280efb24081360c2f10a6dc4c277ce04cc1b39973bd459c695a7f7bddf11c2"} Feb 25 07:21:45 crc kubenswrapper[4749]: I0225 07:21:45.911262 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.911244516 podStartE2EDuration="1.911244516s" podCreationTimestamp="2026-02-25 07:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-25 07:21:45.909055555 +0000 UTC m=+259.270881585" watchObservedRunningTime="2026-02-25 07:21:45.911244516 +0000 UTC m=+259.273070546" Feb 25 07:21:49 crc kubenswrapper[4749]: I0225 07:21:49.925428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerStarted","Data":"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2"} Feb 25 07:21:49 crc kubenswrapper[4749]: I0225 07:21:49.930066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerStarted","Data":"838ee78d0775f1f4d2216c79b3c3200876868c2f15c669dba019745044ffaaa4"} Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.130389 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.518069 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.699983 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.940520 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerID="838ee78d0775f1f4d2216c79b3c3200876868c2f15c669dba019745044ffaaa4" exitCode=0 Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.940647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerDied","Data":"838ee78d0775f1f4d2216c79b3c3200876868c2f15c669dba019745044ffaaa4"} Feb 25 07:21:50 crc 
kubenswrapper[4749]: I0225 07:21:50.944720 4749 generic.go:334] "Generic (PLEG): container finished" podID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerID="ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2" exitCode=0 Feb 25 07:21:50 crc kubenswrapper[4749]: I0225 07:21:50.944766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerDied","Data":"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2"} Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.671864 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.672185 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.952718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerStarted","Data":"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714"} Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.955104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerStarted","Data":"5676e5c058c15e555a4a31919d1fda4117656d1fac0f10d679131957faa16003"} Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 
07:21:51.957378 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerID="a72ce161885e729b31c394e17be6dcc914b30a04ed1fc31736ad35716fe23ed2" exitCode=0 Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.957403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerDied","Data":"a72ce161885e729b31c394e17be6dcc914b30a04ed1fc31736ad35716fe23ed2"} Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.977205 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-chmpx" podStartSLOduration=2.970830353 podStartE2EDuration="49.977184673s" podCreationTimestamp="2026-02-25 07:21:02 +0000 UTC" firstStartedPulling="2026-02-25 07:21:04.454231979 +0000 UTC m=+217.816057999" lastFinishedPulling="2026-02-25 07:21:51.460586289 +0000 UTC m=+264.822412319" observedRunningTime="2026-02-25 07:21:51.974173995 +0000 UTC m=+265.336000035" watchObservedRunningTime="2026-02-25 07:21:51.977184673 +0000 UTC m=+265.339010703" Feb 25 07:21:51 crc kubenswrapper[4749]: I0225 07:21:51.996180 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cpkr5" podStartSLOduration=2.959687873 podStartE2EDuration="49.996153216s" podCreationTimestamp="2026-02-25 07:21:02 +0000 UTC" firstStartedPulling="2026-02-25 07:21:04.40602434 +0000 UTC m=+217.767850350" lastFinishedPulling="2026-02-25 07:21:51.442489663 +0000 UTC m=+264.804315693" observedRunningTime="2026-02-25 07:21:51.99525915 +0000 UTC m=+265.357085180" watchObservedRunningTime="2026-02-25 07:21:51.996153216 +0000 UTC m=+265.357979266" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.202410 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:52 crc 
kubenswrapper[4749]: I0225 07:21:52.202499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.260498 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.404872 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.405123 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfddj" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="registry-server" containerID="cri-o://bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c" gracePeriod=2 Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.623978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.624327 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.867900 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.965236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerStarted","Data":"bdc368f7951f633f845f7d9178f5b3e679324ca0ab8a41fe8e6d7de4d8af80f0"} Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.968711 4749 generic.go:334] "Generic (PLEG): container finished" podID="fe08645e-d824-45ee-abb3-f6052d153605" containerID="bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c" exitCode=0 Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.969294 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfddj" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.969503 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerDied","Data":"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c"} Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.969537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfddj" event={"ID":"fe08645e-d824-45ee-abb3-f6052d153605","Type":"ContainerDied","Data":"5c71323c33ad714bd22b09ee4853e7ab8a55df96657ce873ab1ef8d68b98f9f8"} Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.969557 4749 scope.go:117] "RemoveContainer" containerID="bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.989798 4749 scope.go:117] "RemoveContainer" containerID="21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d" Feb 25 07:21:52 crc kubenswrapper[4749]: I0225 07:21:52.997878 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-cqprr" podStartSLOduration=3.099426061 podStartE2EDuration="49.997859526s" podCreationTimestamp="2026-02-25 07:21:03 +0000 UTC" firstStartedPulling="2026-02-25 07:21:05.475673687 +0000 UTC m=+218.837499707" lastFinishedPulling="2026-02-25 07:21:52.374107152 +0000 UTC m=+265.735933172" observedRunningTime="2026-02-25 07:21:52.996567119 +0000 UTC m=+266.358393139" watchObservedRunningTime="2026-02-25 07:21:52.997859526 +0000 UTC m=+266.359685566" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.021938 4749 scope.go:117] "RemoveContainer" containerID="831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.026064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.041541 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content\") pod \"fe08645e-d824-45ee-abb3-f6052d153605\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.041629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjr4l\" (UniqueName: \"kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l\") pod \"fe08645e-d824-45ee-abb3-f6052d153605\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.041746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities\") pod \"fe08645e-d824-45ee-abb3-f6052d153605\" (UID: \"fe08645e-d824-45ee-abb3-f6052d153605\") " Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.042506 4749 
scope.go:117] "RemoveContainer" containerID="bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c" Feb 25 07:21:53 crc kubenswrapper[4749]: E0225 07:21:53.043871 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c\": container with ID starting with bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c not found: ID does not exist" containerID="bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.043911 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c"} err="failed to get container status \"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c\": rpc error: code = NotFound desc = could not find container \"bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c\": container with ID starting with bd174d01972dd032fd08884fd67037b04124ef25eac14d83c1f869406cea881c not found: ID does not exist" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.043936 4749 scope.go:117] "RemoveContainer" containerID="21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d" Feb 25 07:21:53 crc kubenswrapper[4749]: E0225 07:21:53.044257 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d\": container with ID starting with 21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d not found: ID does not exist" containerID="21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.044275 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d"} err="failed to get container status \"21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d\": rpc error: code = NotFound desc = could not find container \"21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d\": container with ID starting with 21ebbf5d50a12d805475cc89ebcc9b4fa28b49925e97c36b7334b8e70667fa5d not found: ID does not exist" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.044289 4749 scope.go:117] "RemoveContainer" containerID="831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d" Feb 25 07:21:53 crc kubenswrapper[4749]: E0225 07:21:53.044562 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d\": container with ID starting with 831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d not found: ID does not exist" containerID="831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.044578 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d"} err="failed to get container status \"831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d\": rpc error: code = NotFound desc = could not find container \"831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d\": container with ID starting with 831ae453ed0fb428bbf7d5d5477c4a7d494e39ea01d7604a8392b82c0057e63d not found: ID does not exist" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.046270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities" (OuterVolumeSpecName: "utilities") pod "fe08645e-d824-45ee-abb3-f6052d153605" (UID: 
"fe08645e-d824-45ee-abb3-f6052d153605"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.051159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l" (OuterVolumeSpecName: "kube-api-access-fjr4l") pod "fe08645e-d824-45ee-abb3-f6052d153605" (UID: "fe08645e-d824-45ee-abb3-f6052d153605"). InnerVolumeSpecName "kube-api-access-fjr4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.102963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe08645e-d824-45ee-abb3-f6052d153605" (UID: "fe08645e-d824-45ee-abb3-f6052d153605"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.142890 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.142918 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjr4l\" (UniqueName: \"kubernetes.io/projected/fe08645e-d824-45ee-abb3-f6052d153605-kube-api-access-fjr4l\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.142928 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe08645e-d824-45ee-abb3-f6052d153605-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.201989 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.202288 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.302221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.307164 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfddj"] Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.328833 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe08645e-d824-45ee-abb3-f6052d153605" path="/var/lib/kubelet/pods/fe08645e-d824-45ee-abb3-f6052d153605/volumes" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.651702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.651739 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.671053 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-chmpx" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="registry-server" probeResult="failure" output=< Feb 25 07:21:53 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:21:53 crc kubenswrapper[4749]: > Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.802591 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.803265 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4mn2q" 
podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="registry-server" containerID="cri-o://5d71727286b8f34049b1a2cca21d0779c393ee80e5f10e4acefb551f4d87db85" gracePeriod=2 Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.979344 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d303108-d189-4334-98e6-640b99c33faf" containerID="5d71727286b8f34049b1a2cca21d0779c393ee80e5f10e4acefb551f4d87db85" exitCode=0 Feb 25 07:21:53 crc kubenswrapper[4749]: I0225 07:21:53.979770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerDied","Data":"5d71727286b8f34049b1a2cca21d0779c393ee80e5f10e4acefb551f4d87db85"} Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.250527 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cpkr5" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="registry-server" probeResult="failure" output=< Feb 25 07:21:54 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:21:54 crc kubenswrapper[4749]: > Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.335964 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.456877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content\") pod \"0d303108-d189-4334-98e6-640b99c33faf\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.456965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities\") pod \"0d303108-d189-4334-98e6-640b99c33faf\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.457007 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5gg9\" (UniqueName: \"kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9\") pod \"0d303108-d189-4334-98e6-640b99c33faf\" (UID: \"0d303108-d189-4334-98e6-640b99c33faf\") " Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.457853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities" (OuterVolumeSpecName: "utilities") pod "0d303108-d189-4334-98e6-640b99c33faf" (UID: "0d303108-d189-4334-98e6-640b99c33faf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.461522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9" (OuterVolumeSpecName: "kube-api-access-x5gg9") pod "0d303108-d189-4334-98e6-640b99c33faf" (UID: "0d303108-d189-4334-98e6-640b99c33faf"). InnerVolumeSpecName "kube-api-access-x5gg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.506068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d303108-d189-4334-98e6-640b99c33faf" (UID: "0d303108-d189-4334-98e6-640b99c33faf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.558889 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.558932 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d303108-d189-4334-98e6-640b99c33faf-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.558945 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5gg9\" (UniqueName: \"kubernetes.io/projected/0d303108-d189-4334-98e6-640b99c33faf-kube-api-access-x5gg9\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.661427 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" podUID="76c21345-d376-4531-a431-faa389fc0623" containerName="oauth-openshift" containerID="cri-o://2209e480368b3aeec65634e662859894911cb4e758a2ec2392672aa5d57ffb84" gracePeriod=15 Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.719666 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cqprr" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="registry-server" probeResult="failure" output=< Feb 25 07:21:54 crc 
kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:21:54 crc kubenswrapper[4749]: > Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.991037 4749 generic.go:334] "Generic (PLEG): container finished" podID="76c21345-d376-4531-a431-faa389fc0623" containerID="2209e480368b3aeec65634e662859894911cb4e758a2ec2392672aa5d57ffb84" exitCode=0 Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.991089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" event={"ID":"76c21345-d376-4531-a431-faa389fc0623","Type":"ContainerDied","Data":"2209e480368b3aeec65634e662859894911cb4e758a2ec2392672aa5d57ffb84"} Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.993670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mn2q" event={"ID":"0d303108-d189-4334-98e6-640b99c33faf","Type":"ContainerDied","Data":"0726c5d8c457e83f3eb5638e8c81ad719c6f0c46a70deb162d344c1d143f1fcf"} Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.993747 4749 scope.go:117] "RemoveContainer" containerID="5d71727286b8f34049b1a2cca21d0779c393ee80e5f10e4acefb551f4d87db85" Feb 25 07:21:54 crc kubenswrapper[4749]: I0225 07:21:54.993933 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mn2q" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.026564 4749 scope.go:117] "RemoveContainer" containerID="936b8789cf4393289c39de1f3946f1870a5854873f6137a60a609300caf62f2d" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.027647 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.033305 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4mn2q"] Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.051327 4749 scope.go:117] "RemoveContainer" containerID="d452703f077a4dc22f0d71e0301c8345dc777fd496006043e083a9307a5b398c" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.107893 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268314 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268708 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268863 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268950 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.268878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.269984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 
25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fnt5\" (UniqueName: \"kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5\") pod \"76c21345-d376-4531-a431-faa389fc0623\" (UID: \"76c21345-d376-4531-a431-faa389fc0623\") " Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270835 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270862 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76c21345-d376-4531-a431-faa389fc0623-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270881 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.270899 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc 
kubenswrapper[4749]: I0225 07:21:55.273397 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5" (OuterVolumeSpecName: "kube-api-access-5fnt5") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "kube-api-access-5fnt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.273445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.273754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.273840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.274223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.274402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.274643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.275319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.275970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "76c21345-d376-4531-a431-faa389fc0623" (UID: "76c21345-d376-4531-a431-faa389fc0623"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.331097 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d303108-d189-4334-98e6-640b99c33faf" path="/var/lib/kubelet/pods/0d303108-d189-4334-98e6-640b99c33faf/volumes" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372669 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372724 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372752 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372773 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-template-login\") on node \"crc\" DevicePath 
\"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372795 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372816 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372835 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76c21345-d376-4531-a431-faa389fc0623-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372854 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fnt5\" (UniqueName: \"kubernetes.io/projected/76c21345-d376-4531-a431-faa389fc0623-kube-api-access-5fnt5\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372874 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:55 crc kubenswrapper[4749]: I0225 07:21:55.372894 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/76c21345-d376-4531-a431-faa389fc0623-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:56 crc kubenswrapper[4749]: I0225 07:21:56.003742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" 
event={"ID":"76c21345-d376-4531-a431-faa389fc0623","Type":"ContainerDied","Data":"e7aeaf77f7da55fffe1535ff45d621a6c27dc81a02cf4d8e71959b08f7cb990f"} Feb 25 07:21:56 crc kubenswrapper[4749]: I0225 07:21:56.003840 4749 scope.go:117] "RemoveContainer" containerID="2209e480368b3aeec65634e662859894911cb4e758a2ec2392672aa5d57ffb84" Feb 25 07:21:56 crc kubenswrapper[4749]: I0225 07:21:56.003792 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cdjg8" Feb 25 07:21:56 crc kubenswrapper[4749]: I0225 07:21:56.038040 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:21:56 crc kubenswrapper[4749]: I0225 07:21:56.046796 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cdjg8"] Feb 25 07:21:57 crc kubenswrapper[4749]: I0225 07:21:57.335507 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c21345-d376-4531-a431-faa389fc0623" path="/var/lib/kubelet/pods/76c21345-d376-4531-a431-faa389fc0623/volumes" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.012324 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.013116 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" podUID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" containerName="controller-manager" containerID="cri-o://a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a" gracePeriod=30 Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.031711 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.031969 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" podUID="e051ce6a-255e-4b63-93c4-5679a453341d" containerName="route-controller-manager" containerID="cri-o://7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680" gracePeriod=30 Feb 25 07:21:59 crc kubenswrapper[4749]: E0225 07:21:59.120721 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode051ce6a_255e_4b63_93c4_5679a453341d.slice/crio-7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b9b68e3_291c_40f6_b59b_8a0c3a9a3e59.slice/crio-a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode051ce6a_255e_4b63_93c4_5679a453341d.slice/crio-conmon-7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680.scope\": RecentStats: unable to find data in memory cache]" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.549492 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.558069 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config\") pod \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6js7\" (UniqueName: \"kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7\") pod \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4pq4\" (UniqueName: \"kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4\") pod \"e051ce6a-255e-4b63-93c4-5679a453341d\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config\") pod \"e051ce6a-255e-4b63-93c4-5679a453341d\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles\") pod \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639345 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca\") pod \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639360 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert\") pod \"e051ce6a-255e-4b63-93c4-5679a453341d\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert\") pod \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\" (UID: \"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.639400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca\") pod \"e051ce6a-255e-4b63-93c4-5679a453341d\" (UID: \"e051ce6a-255e-4b63-93c4-5679a453341d\") " Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.640244 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca" (OuterVolumeSpecName: "client-ca") pod "e051ce6a-255e-4b63-93c4-5679a453341d" (UID: "e051ce6a-255e-4b63-93c4-5679a453341d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.640797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config" (OuterVolumeSpecName: "config") pod "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" (UID: "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.641758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" (UID: "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.648773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" (UID: "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.648788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e051ce6a-255e-4b63-93c4-5679a453341d" (UID: "e051ce6a-255e-4b63-93c4-5679a453341d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.648789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7" (OuterVolumeSpecName: "kube-api-access-j6js7") pod "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" (UID: "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59"). InnerVolumeSpecName "kube-api-access-j6js7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.662819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4" (OuterVolumeSpecName: "kube-api-access-l4pq4") pod "e051ce6a-255e-4b63-93c4-5679a453341d" (UID: "e051ce6a-255e-4b63-93c4-5679a453341d"). InnerVolumeSpecName "kube-api-access-l4pq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.662843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" (UID: "4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.663176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config" (OuterVolumeSpecName: "config") pod "e051ce6a-255e-4b63-93c4-5679a453341d" (UID: "e051ce6a-255e-4b63-93c4-5679a453341d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741171 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741208 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741219 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741231 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6js7\" (UniqueName: \"kubernetes.io/projected/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-kube-api-access-j6js7\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741244 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4pq4\" (UniqueName: \"kubernetes.io/projected/e051ce6a-255e-4b63-93c4-5679a453341d-kube-api-access-l4pq4\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741255 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e051ce6a-255e-4b63-93c4-5679a453341d-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741267 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741279 4749 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:21:59 crc kubenswrapper[4749]: I0225 07:21:59.741289 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e051ce6a-255e-4b63-93c4-5679a453341d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.036151 4749 generic.go:334] "Generic (PLEG): container finished" podID="e051ce6a-255e-4b63-93c4-5679a453341d" containerID="7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680" exitCode=0 Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.036271 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.036289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" event={"ID":"e051ce6a-255e-4b63-93c4-5679a453341d","Type":"ContainerDied","Data":"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680"} Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.036341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl" event={"ID":"e051ce6a-255e-4b63-93c4-5679a453341d","Type":"ContainerDied","Data":"6b61876996c3fc4b4e400c0992ce3f7769cd9172031e6a87f47d4eb1d6e8da46"} Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.036376 4749 scope.go:117] "RemoveContainer" containerID="7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.039556 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" 
containerID="a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a" exitCode=0 Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.039564 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.039588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" event={"ID":"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59","Type":"ContainerDied","Data":"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a"} Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.040887 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b45fcccb-h8qbz" event={"ID":"4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59","Type":"ContainerDied","Data":"f452dae90a711398d2f16fdf2490097a0654b80dcbb2157b1a8bc5d92c1097c4"} Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.063667 4749 scope.go:117] "RemoveContainer" containerID="7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.064421 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680\": container with ID starting with 7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680 not found: ID does not exist" containerID="7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.064482 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680"} err="failed to get container status \"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680\": rpc error: code = NotFound desc = could 
not find container \"7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680\": container with ID starting with 7b3c411ee1194cd89d0853842e4fd620c947dc301dd1c6ff78b8b4cbc6391680 not found: ID does not exist" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.064517 4749 scope.go:117] "RemoveContainer" containerID="a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.078542 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.083993 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84bc784f4f-xz4hl"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.104137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.107293 4749 scope.go:117] "RemoveContainer" containerID="a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.108535 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a\": container with ID starting with a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a not found: ID does not exist" containerID="a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.108674 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a"} err="failed to get container status \"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a\": rpc error: code = NotFound desc 
= could not find container \"a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a\": container with ID starting with a173e845ba932ff48429c8f7ca2bae4ffeaa8a87577ac381e4a7a2ef9935b22a not found: ID does not exist" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.109352 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b45fcccb-h8qbz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.146710 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533402-x8rfc"] Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149175 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149223 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="extract-utilities" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="extract-utilities" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149246 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="extract-utilities" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149255 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="extract-utilities" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149300 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c21345-d376-4531-a431-faa389fc0623" containerName="oauth-openshift" Feb 25 
07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149310 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c21345-d376-4531-a431-faa389fc0623" containerName="oauth-openshift" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149319 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="extract-content" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149326 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="extract-content" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149336 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e051ce6a-255e-4b63-93c4-5679a453341d" containerName="route-controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149344 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e051ce6a-255e-4b63-93c4-5679a453341d" containerName="route-controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149353 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" containerName="controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149395 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" containerName="controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="extract-content" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149421 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="extract-content" Feb 25 07:22:00 crc kubenswrapper[4749]: E0225 07:22:00.149432 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe08645e-d824-45ee-abb3-f6052d153605" 
containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149439 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149677 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c21345-d376-4531-a431-faa389fc0623" containerName="oauth-openshift" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149732 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe08645e-d824-45ee-abb3-f6052d153605" containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149748 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d303108-d189-4334-98e6-640b99c33faf" containerName="registry-server" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149760 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" containerName="controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.149804 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e051ce6a-255e-4b63-93c4-5679a453341d" containerName="route-controller-manager" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.150340 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.154755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.156029 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.159458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533402-x8rfc"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.161053 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.245838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9lmh\" (UniqueName: \"kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh\") pod \"auto-csr-approver-29533402-x8rfc\" (UID: \"58bbf72f-7a88-435b-923c-9b56dfd488c5\") " pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.347318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9lmh\" (UniqueName: \"kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh\") pod \"auto-csr-approver-29533402-x8rfc\" (UID: \"58bbf72f-7a88-435b-923c-9b56dfd488c5\") " pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.376785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9lmh\" (UniqueName: \"kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh\") pod \"auto-csr-approver-29533402-x8rfc\" (UID: \"58bbf72f-7a88-435b-923c-9b56dfd488c5\") " 
pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.481800 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.869167 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533402-x8rfc"] Feb 25 07:22:00 crc kubenswrapper[4749]: W0225 07:22:00.879781 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58bbf72f_7a88_435b_923c_9b56dfd488c5.slice/crio-8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c WatchSource:0}: Error finding container 8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c: Status 404 returned error can't find the container with id 8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.884980 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.886549 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.893993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.894285 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.894404 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.896903 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.906848 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.906933 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.911925 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.912477 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.917038 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.918315 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.919666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.920232 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.922056 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.922302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.922506 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.923942 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.927688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.954471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.954761 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6dh\" (UniqueName: \"kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.955815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.955893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.955942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8q8q\" (UniqueName: \"kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.956005 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca\") pod 
\"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.956046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.956109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:00 crc kubenswrapper[4749]: I0225 07:22:00.956133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.047360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" event={"ID":"58bbf72f-7a88-435b-923c-9b56dfd488c5","Type":"ContainerStarted","Data":"8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c"} Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6dh\" (UniqueName: \"kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8q8q\" (UniqueName: \"kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 
crc kubenswrapper[4749]: I0225 07:22:01.057510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.057985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.058816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " 
pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.059022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.060068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.060453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.060921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.064929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert\") pod 
\"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.065283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.077526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6dh\" (UniqueName: \"kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh\") pod \"controller-manager-6866dc749-wg5hz\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.078630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8q8q\" (UniqueName: \"kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q\") pod \"route-controller-manager-7cb866cfb6-2nlgz\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.244468 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.258193 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.339062 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59" path="/var/lib/kubelet/pods/4b9b68e3-291c-40f6-b59b-8a0c3a9a3e59/volumes" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.340523 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e051ce6a-255e-4b63-93c4-5679a453341d" path="/var/lib/kubelet/pods/e051ce6a-255e-4b63-93c4-5679a453341d/volumes" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.594407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.737748 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:01 crc kubenswrapper[4749]: W0225 07:22:01.753458 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab473c1_2a0a_4745_9c1c_4b8290fa1e84.slice/crio-1c868e50147e4db9731fb3aef7003ab7babd848ad8537a65be19e60fae61fa0e WatchSource:0}: Error finding container 1c868e50147e4db9731fb3aef7003ab7babd848ad8537a65be19e60fae61fa0e: Status 404 returned error can't find the container with id 1c868e50147e4db9731fb3aef7003ab7babd848ad8537a65be19e60fae61fa0e Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.880390 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r"] Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.881218 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.884961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.887501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.887872 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.887935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.888568 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.888659 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.888876 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.888991 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.889204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.889440 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 07:22:01 
crc kubenswrapper[4749]: I0225 07:22:01.890011 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.891672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.901887 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.904500 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.909971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r"] Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.914524 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.972766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.972863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.972918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.972956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.972999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973092 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973251 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bkhs\" (UniqueName: 
\"kubernetes.io/projected/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-kube-api-access-2bkhs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-session\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:01 crc kubenswrapper[4749]: I0225 07:22:01.973370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.057428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" 
event={"ID":"305d5ffb-b0b2-4521-8025-f305c6d3e7c8","Type":"ContainerStarted","Data":"1ee5e70f57798f6054820da75cf37935b2cfc744cbecfddfec18f0c3126a2d37"} Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.059520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" event={"ID":"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84","Type":"ContainerStarted","Data":"1c868e50147e4db9731fb3aef7003ab7babd848ad8537a65be19e60fae61fa0e"} Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.074987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bkhs\" (UniqueName: 
\"kubernetes.io/projected/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-kube-api-access-2bkhs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.075280 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-session\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.076585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.076669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.077398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc 
kubenswrapper[4749]: I0225 07:22:02.077536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.077832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.082742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.083212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.083486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-session\") pod 
\"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.083619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.084115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.084342 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.087034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.087389 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.107454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bkhs\" (UniqueName: \"kubernetes.io/projected/8be88b19-ec1a-40ef-a24a-eaf46186a2d7-kube-api-access-2bkhs\") pod \"oauth-openshift-5d4b6f47b4-ts77r\" (UID: \"8be88b19-ec1a-40ef-a24a-eaf46186a2d7\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.211571 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.693227 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.751509 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:22:02 crc kubenswrapper[4749]: I0225 07:22:02.771924 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r"] Feb 25 07:22:02 crc kubenswrapper[4749]: W0225 07:22:02.776777 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8be88b19_ec1a_40ef_a24a_eaf46186a2d7.slice/crio-ddcf7aaea78cf57453a9741090c94130f98e71f02f1a675f9d78d28723f38242 WatchSource:0}: Error finding container ddcf7aaea78cf57453a9741090c94130f98e71f02f1a675f9d78d28723f38242: Status 404 returned error can't find the 
container with id ddcf7aaea78cf57453a9741090c94130f98e71f02f1a675f9d78d28723f38242 Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.067630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" event={"ID":"305d5ffb-b0b2-4521-8025-f305c6d3e7c8","Type":"ContainerStarted","Data":"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff"} Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.068003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.071074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" event={"ID":"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84","Type":"ContainerStarted","Data":"357eac0aa48a6ad5f461c12f89586f9968e6c87253283451e4043730f5ac59c9"} Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.072349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" event={"ID":"8be88b19-ec1a-40ef-a24a-eaf46186a2d7","Type":"ContainerStarted","Data":"ddcf7aaea78cf57453a9741090c94130f98e71f02f1a675f9d78d28723f38242"} Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.089584 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" podStartSLOduration=4.08956101 podStartE2EDuration="4.08956101s" podCreationTimestamp="2026-02-25 07:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:03.088656673 +0000 UTC m=+276.450482693" watchObservedRunningTime="2026-02-25 07:22:03.08956101 +0000 UTC m=+276.451387030" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.245481 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.295419 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.385897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.398856 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.709783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:22:03 crc kubenswrapper[4749]: I0225 07:22:03.760721 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.087288 4749 generic.go:334] "Generic (PLEG): container finished" podID="58bbf72f-7a88-435b-923c-9b56dfd488c5" containerID="bdb579912f9163bb51d8cd8835dc5eef67975260bf9a8fa3dba0fa89d9c819c5" exitCode=0 Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.087340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" event={"ID":"58bbf72f-7a88-435b-923c-9b56dfd488c5","Type":"ContainerDied","Data":"bdb579912f9163bb51d8cd8835dc5eef67975260bf9a8fa3dba0fa89d9c819c5"} Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.092422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" event={"ID":"8be88b19-ec1a-40ef-a24a-eaf46186a2d7","Type":"ContainerStarted","Data":"8af3f2a21d369eff034639973b005d6c9b0f86f5441b97284c3db82eccc4c1e1"} Feb 25 
07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.092485 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.093727 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-chmpx" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="registry-server" containerID="cri-o://bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714" gracePeriod=2 Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.095132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.101398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.101717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.165260 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" podStartSLOduration=5.165233284 podStartE2EDuration="5.165233284s" podCreationTimestamp="2026-02-25 07:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:04.161709462 +0000 UTC m=+277.523535532" watchObservedRunningTime="2026-02-25 07:22:04.165233284 +0000 UTC m=+277.527059344" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.183113 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-ts77r" podStartSLOduration=35.183089914 
podStartE2EDuration="35.183089914s" podCreationTimestamp="2026-02-25 07:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:04.135434247 +0000 UTC m=+277.497260317" watchObservedRunningTime="2026-02-25 07:22:04.183089914 +0000 UTC m=+277.544915934" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.499995 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.611932 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content\") pod \"c8a4c468-f347-4632-a517-c0e9bb839e98\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.612174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxkw2\" (UniqueName: \"kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2\") pod \"c8a4c468-f347-4632-a517-c0e9bb839e98\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.612256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities\") pod \"c8a4c468-f347-4632-a517-c0e9bb839e98\" (UID: \"c8a4c468-f347-4632-a517-c0e9bb839e98\") " Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.613917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities" (OuterVolumeSpecName: "utilities") pod "c8a4c468-f347-4632-a517-c0e9bb839e98" (UID: "c8a4c468-f347-4632-a517-c0e9bb839e98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.614700 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.620779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2" (OuterVolumeSpecName: "kube-api-access-mxkw2") pod "c8a4c468-f347-4632-a517-c0e9bb839e98" (UID: "c8a4c468-f347-4632-a517-c0e9bb839e98"). InnerVolumeSpecName "kube-api-access-mxkw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.642036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8a4c468-f347-4632-a517-c0e9bb839e98" (UID: "c8a4c468-f347-4632-a517-c0e9bb839e98"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.715655 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a4c468-f347-4632-a517-c0e9bb839e98-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:04 crc kubenswrapper[4749]: I0225 07:22:04.715710 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxkw2\" (UniqueName: \"kubernetes.io/projected/c8a4c468-f347-4632-a517-c0e9bb839e98-kube-api-access-mxkw2\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.108967 4749 generic.go:334] "Generic (PLEG): container finished" podID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerID="bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714" exitCode=0 Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.109035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerDied","Data":"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714"} Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.109102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-chmpx" event={"ID":"c8a4c468-f347-4632-a517-c0e9bb839e98","Type":"ContainerDied","Data":"d754c34bcd105fe4c62b69d393aa04019f97e8653ba6d8f861dde270f461260a"} Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.109128 4749 scope.go:117] "RemoveContainer" containerID="bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.109220 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-chmpx" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.151352 4749 scope.go:117] "RemoveContainer" containerID="ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.158884 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.165348 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-chmpx"] Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.175132 4749 scope.go:117] "RemoveContainer" containerID="a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.209003 4749 scope.go:117] "RemoveContainer" containerID="bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714" Feb 25 07:22:05 crc kubenswrapper[4749]: E0225 07:22:05.209730 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714\": container with ID starting with bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714 not found: ID does not exist" containerID="bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.209933 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714"} err="failed to get container status \"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714\": rpc error: code = NotFound desc = could not find container \"bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714\": container with ID starting with bd0d4602ec134a9ce18e342d68206a1e0c7aac2007915b5b9da2b1b6cf962714 not found: 
ID does not exist" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.210148 4749 scope.go:117] "RemoveContainer" containerID="ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2" Feb 25 07:22:05 crc kubenswrapper[4749]: E0225 07:22:05.210934 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2\": container with ID starting with ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2 not found: ID does not exist" containerID="ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.210991 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2"} err="failed to get container status \"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2\": rpc error: code = NotFound desc = could not find container \"ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2\": container with ID starting with ea073194b390e02e833a0762038d16de941bf2751fdbb0c4bb75167ff0010ca2 not found: ID does not exist" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.211029 4749 scope.go:117] "RemoveContainer" containerID="a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17" Feb 25 07:22:05 crc kubenswrapper[4749]: E0225 07:22:05.211614 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17\": container with ID starting with a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17 not found: ID does not exist" containerID="a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.211647 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17"} err="failed to get container status \"a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17\": rpc error: code = NotFound desc = could not find container \"a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17\": container with ID starting with a2d4e1ac695f92b141f040a7f2b5d8c7a1ec78046d6ec10ac5374d146b91cc17 not found: ID does not exist" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.333524 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" path="/var/lib/kubelet/pods/c8a4c468-f347-4632-a517-c0e9bb839e98/volumes" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.487769 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.527579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9lmh\" (UniqueName: \"kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh\") pod \"58bbf72f-7a88-435b-923c-9b56dfd488c5\" (UID: \"58bbf72f-7a88-435b-923c-9b56dfd488c5\") " Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.532551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh" (OuterVolumeSpecName: "kube-api-access-z9lmh") pod "58bbf72f-7a88-435b-923c-9b56dfd488c5" (UID: "58bbf72f-7a88-435b-923c-9b56dfd488c5"). InnerVolumeSpecName "kube-api-access-z9lmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.609466 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.610082 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cqprr" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="registry-server" containerID="cri-o://bdc368f7951f633f845f7d9178f5b3e679324ca0ab8a41fe8e6d7de4d8af80f0" gracePeriod=2 Feb 25 07:22:05 crc kubenswrapper[4749]: I0225 07:22:05.630281 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9lmh\" (UniqueName: \"kubernetes.io/projected/58bbf72f-7a88-435b-923c-9b56dfd488c5-kube-api-access-z9lmh\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.116946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" event={"ID":"58bbf72f-7a88-435b-923c-9b56dfd488c5","Type":"ContainerDied","Data":"8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c"} Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.117270 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f4172a791f049a429e52fa332a240bd6ab32fe5cdeb5da5cfb6efdc2e2bd58c" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.117041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533402-x8rfc" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.121452 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerID="bdc368f7951f633f845f7d9178f5b3e679324ca0ab8a41fe8e6d7de4d8af80f0" exitCode=0 Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.122314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerDied","Data":"bdc368f7951f633f845f7d9178f5b3e679324ca0ab8a41fe8e6d7de4d8af80f0"} Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.122380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cqprr" event={"ID":"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774","Type":"ContainerDied","Data":"7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9"} Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.122404 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f61939f9d1fe760c39d03ebe9873692098975307086090c2293e058ca3b81c9" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.150102 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.239232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5t6c\" (UniqueName: \"kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c\") pod \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.239583 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities\") pod \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.239667 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content\") pod \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\" (UID: \"8f62d900-f9e9-462b-9dc1-4cfd3eaeb774\") " Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.242166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities" (OuterVolumeSpecName: "utilities") pod "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" (UID: "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.243522 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.257632 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c" (OuterVolumeSpecName: "kube-api-access-g5t6c") pod "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" (UID: "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774"). InnerVolumeSpecName "kube-api-access-g5t6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.344644 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5t6c\" (UniqueName: \"kubernetes.io/projected/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-kube-api-access-g5t6c\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.358304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" (UID: "8f62d900-f9e9-462b-9dc1-4cfd3eaeb774"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:22:06 crc kubenswrapper[4749]: I0225 07:22:06.446322 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:07 crc kubenswrapper[4749]: I0225 07:22:07.138739 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cqprr" Feb 25 07:22:07 crc kubenswrapper[4749]: I0225 07:22:07.185574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:22:07 crc kubenswrapper[4749]: I0225 07:22:07.195563 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cqprr"] Feb 25 07:22:07 crc kubenswrapper[4749]: I0225 07:22:07.330165 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" path="/var/lib/kubelet/pods/8f62d900-f9e9-462b-9dc1-4cfd3eaeb774/volumes" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.035016 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.035770 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" podUID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" containerName="controller-manager" containerID="cri-o://357eac0aa48a6ad5f461c12f89586f9968e6c87253283451e4043730f5ac59c9" gracePeriod=30 Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.127932 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.128319 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" podUID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" containerName="route-controller-manager" containerID="cri-o://ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff" gracePeriod=30 Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.231746 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" containerID="357eac0aa48a6ad5f461c12f89586f9968e6c87253283451e4043730f5ac59c9" exitCode=0 Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.231798 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" event={"ID":"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84","Type":"ContainerDied","Data":"357eac0aa48a6ad5f461c12f89586f9968e6c87253283451e4043730f5ac59c9"} Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.591269 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.639965 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.646078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert\") pod \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.646158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config\") pod \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.646204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8q8q\" (UniqueName: \"kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q\") pod \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " Feb 25 07:22:19 crc 
kubenswrapper[4749]: I0225 07:22:19.646247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca\") pod \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\" (UID: \"305d5ffb-b0b2-4521-8025-f305c6d3e7c8\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.647174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "305d5ffb-b0b2-4521-8025-f305c6d3e7c8" (UID: "305d5ffb-b0b2-4521-8025-f305c6d3e7c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.647206 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config" (OuterVolumeSpecName: "config") pod "305d5ffb-b0b2-4521-8025-f305c6d3e7c8" (UID: "305d5ffb-b0b2-4521-8025-f305c6d3e7c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.651725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "305d5ffb-b0b2-4521-8025-f305c6d3e7c8" (UID: "305d5ffb-b0b2-4521-8025-f305c6d3e7c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.651984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q" (OuterVolumeSpecName: "kube-api-access-q8q8q") pod "305d5ffb-b0b2-4521-8025-f305c6d3e7c8" (UID: "305d5ffb-b0b2-4521-8025-f305c6d3e7c8"). 
InnerVolumeSpecName "kube-api-access-q8q8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.747676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert\") pod \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.747737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles\") pod \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.748573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" (UID: "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.748679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config\") pod \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca" (OuterVolumeSpecName: "client-ca") pod "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" (UID: "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config" (OuterVolumeSpecName: "config") pod "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" (UID: "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.748715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca\") pod \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll6dh\" (UniqueName: \"kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh\") pod \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\" (UID: \"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84\") " Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749698 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8q8q\" (UniqueName: \"kubernetes.io/projected/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-kube-api-access-q8q8q\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749716 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749727 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc 
kubenswrapper[4749]: I0225 07:22:19.749736 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749743 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749751 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.749759 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/305d5ffb-b0b2-4521-8025-f305c6d3e7c8-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.753720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" (UID: "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.753775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh" (OuterVolumeSpecName: "kube-api-access-ll6dh") pod "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" (UID: "0ab473c1-2a0a-4745-9c1c-4b8290fa1e84"). InnerVolumeSpecName "kube-api-access-ll6dh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.850793 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:19 crc kubenswrapper[4749]: I0225 07:22:19.850823 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll6dh\" (UniqueName: \"kubernetes.io/projected/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84-kube-api-access-ll6dh\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.237869 4749 generic.go:334] "Generic (PLEG): container finished" podID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" containerID="ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff" exitCode=0 Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.237929 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.237960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" event={"ID":"305d5ffb-b0b2-4521-8025-f305c6d3e7c8","Type":"ContainerDied","Data":"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff"} Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.238002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz" event={"ID":"305d5ffb-b0b2-4521-8025-f305c6d3e7c8","Type":"ContainerDied","Data":"1ee5e70f57798f6054820da75cf37935b2cfc744cbecfddfec18f0c3126a2d37"} Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.238029 4749 scope.go:117] "RemoveContainer" containerID="ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff" Feb 25 07:22:20 crc 
kubenswrapper[4749]: I0225 07:22:20.245026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" event={"ID":"0ab473c1-2a0a-4745-9c1c-4b8290fa1e84","Type":"ContainerDied","Data":"1c868e50147e4db9731fb3aef7003ab7babd848ad8537a65be19e60fae61fa0e"} Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.245105 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6866dc749-wg5hz" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.260510 4749 scope.go:117] "RemoveContainer" containerID="ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.260874 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff\": container with ID starting with ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff not found: ID does not exist" containerID="ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.260915 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff"} err="failed to get container status \"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff\": rpc error: code = NotFound desc = could not find container \"ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff\": container with ID starting with ebc8eeabd49763dc7f0bca434888be9e9c0fc524c67c7330efdf509c7bb631ff not found: ID does not exist" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.260937 4749 scope.go:117] "RemoveContainer" containerID="357eac0aa48a6ad5f461c12f89586f9968e6c87253283451e4043730f5ac59c9" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 
07:22:20.267498 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.270275 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cb866cfb6-2nlgz"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.289433 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.294914 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6866dc749-wg5hz"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.896884 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f9996679-fwkdv"] Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897209 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" containerName="route-controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897238 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" containerName="route-controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897260 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897273 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897292 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="extract-content" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 
07:22:20.897305 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="extract-content" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897323 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="extract-content" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897335 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="extract-content" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897353 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897365 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897381 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58bbf72f-7a88-435b-923c-9b56dfd488c5" containerName="oc" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897394 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bbf72f-7a88-435b-923c-9b56dfd488c5" containerName="oc" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897414 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" containerName="controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897427 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" containerName="controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897445 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="extract-utilities" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897457 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="extract-utilities" Feb 25 07:22:20 crc kubenswrapper[4749]: E0225 07:22:20.897479 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="extract-utilities" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897491 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="extract-utilities" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897675 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" containerName="controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897691 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58bbf72f-7a88-435b-923c-9b56dfd488c5" containerName="oc" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897713 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" containerName="route-controller-manager" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897731 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a4c468-f347-4632-a517-c0e9bb839e98" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.897757 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f62d900-f9e9-462b-9dc1-4cfd3eaeb774" containerName="registry-server" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.898341 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.901013 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.901455 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.901553 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.901676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.901868 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.902030 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.905029 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.905217 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.908460 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.908672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.908861 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.908931 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.908999 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.909145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.913537 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.914880 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.933007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9996679-fwkdv"] Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.963819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-client-ca\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964005 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-config\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kst8\" (UniqueName: \"kubernetes.io/projected/6322a53d-b0cd-4d85-b50e-6d52d235c98b-kube-api-access-5kst8\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-client-ca\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-config\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " 
pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6322a53d-b0cd-4d85-b50e-6d52d235c98b-serving-cert\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c04e4a6-42e8-4c73-baad-69d1d6108595-serving-cert\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-proxy-ca-bundles\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:20 crc kubenswrapper[4749]: I0225 07:22:20.964257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnzz\" (UniqueName: \"kubernetes.io/projected/3c04e4a6-42e8-4c73-baad-69d1d6108595-kube-api-access-kwnzz\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.065861 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5kst8\" (UniqueName: \"kubernetes.io/projected/6322a53d-b0cd-4d85-b50e-6d52d235c98b-kube-api-access-5kst8\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.065943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-client-ca\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.065997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-config\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6322a53d-b0cd-4d85-b50e-6d52d235c98b-serving-cert\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c04e4a6-42e8-4c73-baad-69d1d6108595-serving-cert\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " 
pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-proxy-ca-bundles\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnzz\" (UniqueName: \"kubernetes.io/projected/3c04e4a6-42e8-4c73-baad-69d1d6108595-kube-api-access-kwnzz\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-client-ca\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.066308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-config\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.067039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-client-ca\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.067268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-client-ca\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.067627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c04e4a6-42e8-4c73-baad-69d1d6108595-config\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.067800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-proxy-ca-bundles\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.068510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6322a53d-b0cd-4d85-b50e-6d52d235c98b-config\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.069811 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c04e4a6-42e8-4c73-baad-69d1d6108595-serving-cert\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.078690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6322a53d-b0cd-4d85-b50e-6d52d235c98b-serving-cert\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.088867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kst8\" (UniqueName: \"kubernetes.io/projected/6322a53d-b0cd-4d85-b50e-6d52d235c98b-kube-api-access-5kst8\") pod \"controller-manager-7f9996679-fwkdv\" (UID: \"6322a53d-b0cd-4d85-b50e-6d52d235c98b\") " pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.090049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnzz\" (UniqueName: \"kubernetes.io/projected/3c04e4a6-42e8-4c73-baad-69d1d6108595-kube-api-access-kwnzz\") pod \"route-controller-manager-597f9ff58f-jxm68\" (UID: \"3c04e4a6-42e8-4c73-baad-69d1d6108595\") " pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.227768 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.245005 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.330880 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab473c1-2a0a-4745-9c1c-4b8290fa1e84" path="/var/lib/kubelet/pods/0ab473c1-2a0a-4745-9c1c-4b8290fa1e84/volumes" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.331511 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305d5ffb-b0b2-4521-8025-f305c6d3e7c8" path="/var/lib/kubelet/pods/305d5ffb-b0b2-4521-8025-f305c6d3e7c8/volumes" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.487426 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68"] Feb 25 07:22:21 crc kubenswrapper[4749]: W0225 07:22:21.492039 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c04e4a6_42e8_4c73_baad_69d1d6108595.slice/crio-7604d68a8461f70b8cb4d671d5f6c0e880b3920b4b14d2a370a6ae7d70ca5174 WatchSource:0}: Error finding container 7604d68a8461f70b8cb4d671d5f6c0e880b3920b4b14d2a370a6ae7d70ca5174: Status 404 returned error can't find the container with id 7604d68a8461f70b8cb4d671d5f6c0e880b3920b4b14d2a370a6ae7d70ca5174 Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.671935 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.672262 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.672307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.672908 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.673000 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71" gracePeriod=600 Feb 25 07:22:21 crc kubenswrapper[4749]: I0225 07:22:21.733210 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f9996679-fwkdv"] Feb 25 07:22:21 crc kubenswrapper[4749]: W0225 07:22:21.738005 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6322a53d_b0cd_4d85_b50e_6d52d235c98b.slice/crio-b016c96fbd57803cc28ddd1beaf79ad54253a41d323056eef4d1cb2f6824c94c WatchSource:0}: Error finding container b016c96fbd57803cc28ddd1beaf79ad54253a41d323056eef4d1cb2f6824c94c: Status 404 returned error can't find the container with id b016c96fbd57803cc28ddd1beaf79ad54253a41d323056eef4d1cb2f6824c94c Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.261334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" event={"ID":"3c04e4a6-42e8-4c73-baad-69d1d6108595","Type":"ContainerStarted","Data":"ec5f4dc6ceff9f098092a8348f579fb8a5b48045b58fd74c361771735d76c188"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.261726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" event={"ID":"3c04e4a6-42e8-4c73-baad-69d1d6108595","Type":"ContainerStarted","Data":"7604d68a8461f70b8cb4d671d5f6c0e880b3920b4b14d2a370a6ae7d70ca5174"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.263038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.269828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" event={"ID":"6322a53d-b0cd-4d85-b50e-6d52d235c98b","Type":"ContainerStarted","Data":"478b1470d262899673536b5c8cad012754c9804f0dd158f183ffe639e8acd21f"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.269873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" event={"ID":"6322a53d-b0cd-4d85-b50e-6d52d235c98b","Type":"ContainerStarted","Data":"b016c96fbd57803cc28ddd1beaf79ad54253a41d323056eef4d1cb2f6824c94c"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.270714 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.273259 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.274865 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71" exitCode=0 Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.274901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.274924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905"} Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.276443 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.280495 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-597f9ff58f-jxm68" podStartSLOduration=3.280479742 podStartE2EDuration="3.280479742s" podCreationTimestamp="2026-02-25 07:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:22.279120863 +0000 UTC m=+295.640946893" watchObservedRunningTime="2026-02-25 07:22:22.280479742 +0000 UTC m=+295.642305762" Feb 25 07:22:22 crc kubenswrapper[4749]: I0225 07:22:22.325243 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f9996679-fwkdv" podStartSLOduration=3.325222895 podStartE2EDuration="3.325222895s" podCreationTimestamp="2026-02-25 07:22:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:22.299865297 +0000 UTC m=+295.661691387" watchObservedRunningTime="2026-02-25 07:22:22.325222895 +0000 UTC m=+295.687048915" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.509055 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.510711 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.510958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.511092 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1" gracePeriod=15 Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.511278 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864" gracePeriod=15 Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.511298 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6" gracePeriod=15 Feb 25 07:22:23 crc kubenswrapper[4749]: 
I0225 07:22:23.511395 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a" gracePeriod=15 Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.511359 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540" gracePeriod=15 Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512111 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512409 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512448 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512477 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512488 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512496 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512504 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512517 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512525 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512537 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512544 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512551 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512558 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512569 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512576 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512585 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc 
kubenswrapper[4749]: I0225 07:22:23.512623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512637 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512644 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512788 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512804 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512815 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512826 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512835 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512844 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512856 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512865 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.512985 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.512997 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.513174 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.565305 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607157 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607254 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.607291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc 
kubenswrapper[4749]: I0225 07:22:23.607312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.708307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.708569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.708710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.708585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.708899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709175 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.709664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: I0225 07:22:23.864814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:22:23 crc kubenswrapper[4749]: W0225 07:22:23.893026 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-403bfaa589aad2cf2129738838ce8068f4f9685d26376ee24798e95e5e2111be WatchSource:0}: Error finding container 403bfaa589aad2cf2129738838ce8068f4f9685d26376ee24798e95e5e2111be: Status 404 returned error can't find the container with id 403bfaa589aad2cf2129738838ce8068f4f9685d26376ee24798e95e5e2111be Feb 25 07:22:23 crc kubenswrapper[4749]: E0225 07:22:23.898691 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18976c6161848aad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,LastTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.292552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3"} Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.292967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"403bfaa589aad2cf2129738838ce8068f4f9685d26376ee24798e95e5e2111be"} Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.293219 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.293634 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.296258 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.298141 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.299709 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540" exitCode=0 Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.299738 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a" exitCode=0 Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.299752 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6" exitCode=0 Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.299760 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864" exitCode=2 Feb 25 07:22:24 crc kubenswrapper[4749]: I0225 07:22:24.299779 4749 scope.go:117] "RemoveContainer" containerID="b21abe3d174a50f89bcaa2d146d3cfd4bf55a95167eb67148e62313bb6a7ad75" Feb 25 07:22:25 crc kubenswrapper[4749]: E0225 07:22:25.259398 4749 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18976c6161848aad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,LastTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:22:25 crc kubenswrapper[4749]: I0225 07:22:25.313316 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.028578 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.029922 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.030651 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.031322 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.151534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.151982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.151747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152659 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152684 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.152714 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.330123 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.331412 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1" exitCode=0 Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.331500 4749 scope.go:117] "RemoveContainer" containerID="a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.331537 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.358463 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.358665 4749 scope.go:117] "RemoveContainer" containerID="e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.359084 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.385741 4749 scope.go:117] "RemoveContainer" containerID="0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.410796 4749 scope.go:117] "RemoveContainer" containerID="346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.446992 4749 scope.go:117] "RemoveContainer" containerID="a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.476178 4749 scope.go:117] "RemoveContainer" containerID="1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.511676 4749 scope.go:117] "RemoveContainer" containerID="a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 
07:22:26.512170 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\": container with ID starting with a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540 not found: ID does not exist" containerID="a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.512197 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540"} err="failed to get container status \"a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\": rpc error: code = NotFound desc = could not find container \"a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540\": container with ID starting with a1f5ce521c860143b7d99e77387601b62affe8b518a4417c586a48ba5237b540 not found: ID does not exist" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.512234 4749 scope.go:117] "RemoveContainer" containerID="e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 07:22:26.513107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\": container with ID starting with e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a not found: ID does not exist" containerID="e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.513129 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a"} err="failed to get container status \"e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\": rpc 
error: code = NotFound desc = could not find container \"e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a\": container with ID starting with e3dc7925618d891347db058640ecadb2c638f892eb37d336b8c8bba38dd5487a not found: ID does not exist" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.513144 4749 scope.go:117] "RemoveContainer" containerID="0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 07:22:26.513477 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\": container with ID starting with 0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6 not found: ID does not exist" containerID="0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.513497 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6"} err="failed to get container status \"0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\": rpc error: code = NotFound desc = could not find container \"0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6\": container with ID starting with 0f8432e0879058ec84eb72c19e34b94874cdf647bc448bf02751d671173a57c6 not found: ID does not exist" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.513508 4749 scope.go:117] "RemoveContainer" containerID="346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 07:22:26.513997 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\": container with ID starting with 
346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864 not found: ID does not exist" containerID="346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.514017 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864"} err="failed to get container status \"346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\": rpc error: code = NotFound desc = could not find container \"346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864\": container with ID starting with 346d1b0134537357f248b75593e919202ab19480b593e417b16f4dd16f662864 not found: ID does not exist" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.514033 4749 scope.go:117] "RemoveContainer" containerID="a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 07:22:26.514313 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\": container with ID starting with a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1 not found: ID does not exist" containerID="a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.514349 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1"} err="failed to get container status \"a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\": rpc error: code = NotFound desc = could not find container \"a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1\": container with ID starting with a4c5f7717ff001e2ba3f5da2f66cca5bbb85c659b75a2fe620e4e679a43aade1 not found: ID does not 
exist" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.514372 4749 scope.go:117] "RemoveContainer" containerID="1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f" Feb 25 07:22:26 crc kubenswrapper[4749]: E0225 07:22:26.514727 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\": container with ID starting with 1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f not found: ID does not exist" containerID="1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f" Feb 25 07:22:26 crc kubenswrapper[4749]: I0225 07:22:26.514751 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f"} err="failed to get container status \"1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\": rpc error: code = NotFound desc = could not find container \"1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f\": container with ID starting with 1d521a1f2f2f04b1bd5f831f10123712b5982b507da6192d7e40e8c531cedd0f not found: ID does not exist" Feb 25 07:22:27 crc kubenswrapper[4749]: I0225 07:22:27.324504 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:27 crc kubenswrapper[4749]: I0225 07:22:27.325184 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.151:6443: connect: connection refused" Feb 25 07:22:27 crc kubenswrapper[4749]: I0225 07:22:27.329125 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 25 07:22:29 crc kubenswrapper[4749]: I0225 07:22:29.362886 4749 generic.go:334] "Generic (PLEG): container finished" podID="011732e4-f837-46cd-91c1-c05625b9b584" containerID="c357c2321d7ae9491991e96bc1e412dd213c5b5e8c147b263e6e278130c682ba" exitCode=0 Feb 25 07:22:29 crc kubenswrapper[4749]: I0225 07:22:29.362985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"011732e4-f837-46cd-91c1-c05625b9b584","Type":"ContainerDied","Data":"c357c2321d7ae9491991e96bc1e412dd213c5b5e8c147b263e6e278130c682ba"} Feb 25 07:22:29 crc kubenswrapper[4749]: I0225 07:22:29.363901 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:29 crc kubenswrapper[4749]: I0225 07:22:29.364381 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.712940 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.714036 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.714772 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access\") pod \"011732e4-f837-46cd-91c1-c05625b9b584\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock\") pod \"011732e4-f837-46cd-91c1-c05625b9b584\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir\") pod \"011732e4-f837-46cd-91c1-c05625b9b584\" (UID: \"011732e4-f837-46cd-91c1-c05625b9b584\") " Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814387 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock" (OuterVolumeSpecName: "var-lock") pod "011732e4-f837-46cd-91c1-c05625b9b584" (UID: "011732e4-f837-46cd-91c1-c05625b9b584"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "011732e4-f837-46cd-91c1-c05625b9b584" (UID: "011732e4-f837-46cd-91c1-c05625b9b584"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814682 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.814722 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/011732e4-f837-46cd-91c1-c05625b9b584-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.820808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "011732e4-f837-46cd-91c1-c05625b9b584" (UID: "011732e4-f837-46cd-91c1-c05625b9b584"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:22:30 crc kubenswrapper[4749]: I0225 07:22:30.915583 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/011732e4-f837-46cd-91c1-c05625b9b584-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 25 07:22:31 crc kubenswrapper[4749]: I0225 07:22:31.379219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"011732e4-f837-46cd-91c1-c05625b9b584","Type":"ContainerDied","Data":"52280efb24081360c2f10a6dc4c277ce04cc1b39973bd459c695a7f7bddf11c2"} Feb 25 07:22:31 crc kubenswrapper[4749]: I0225 07:22:31.379274 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52280efb24081360c2f10a6dc4c277ce04cc1b39973bd459c695a7f7bddf11c2" Feb 25 07:22:31 crc kubenswrapper[4749]: I0225 07:22:31.379287 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 25 07:22:31 crc kubenswrapper[4749]: I0225 07:22:31.384725 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:31 crc kubenswrapper[4749]: I0225 07:22:31.385263 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.484196 4749 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.485288 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.485836 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.486332 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.486912 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:32 crc kubenswrapper[4749]: I0225 07:22:32.486966 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.487397 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Feb 25 07:22:32 crc kubenswrapper[4749]: E0225 07:22:32.688700 4749 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Feb 25 07:22:33 crc kubenswrapper[4749]: E0225 07:22:33.090005 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Feb 25 07:22:33 crc kubenswrapper[4749]: E0225 07:22:33.891523 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Feb 25 07:22:35 crc kubenswrapper[4749]: E0225 07:22:35.260754 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18976c6161848aad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,LastTimestamp:2026-02-25 07:22:23.897283245 +0000 UTC m=+297.259109285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 25 07:22:35 crc kubenswrapper[4749]: E0225 07:22:35.493502 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Feb 25 07:22:37 crc kubenswrapper[4749]: I0225 07:22:37.326750 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:37 crc kubenswrapper[4749]: I0225 07:22:37.327431 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.444170 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.445765 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.445850 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497" exitCode=1 Feb 25 
07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.445895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497"} Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.446683 4749 scope.go:117] "RemoveContainer" containerID="f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.447099 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.447801 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.448361 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:38 crc kubenswrapper[4749]: E0225 07:22:38.700611 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.151:6443: connect: connection refused" interval="6.4s" Feb 25 07:22:38 crc kubenswrapper[4749]: I0225 07:22:38.835838 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.321999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.323364 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.324013 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.324627 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.345883 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.345933 4749 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:39 crc kubenswrapper[4749]: E0225 07:22:39.346533 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.347141 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.456343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f9c9c614715f168888bd2857c7e79cf63178e6ab7c4a03bb2c3052ad744b59e8"} Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.461963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.462913 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.464761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"73d2a413742beca5a25e7d0f86077e097224966e6269a6bc5707a8516700f618"} Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.465717 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.466155 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:39 crc kubenswrapper[4749]: I0225 07:22:39.466816 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.473857 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="778408a2405593eae041047c5f7b0dcc94ec01cdb62617d7e8a4685afe45e168" exitCode=0 Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.473921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"778408a2405593eae041047c5f7b0dcc94ec01cdb62617d7e8a4685afe45e168"} Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.474385 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.474420 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.475040 4749 status_manager.go:851] "Failed to get status for pod" podUID="011732e4-f837-46cd-91c1-c05625b9b584" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:40 crc kubenswrapper[4749]: E0225 07:22:40.475039 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.475425 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:40 crc kubenswrapper[4749]: I0225 07:22:40.475978 4749 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Feb 25 07:22:41 crc kubenswrapper[4749]: I0225 07:22:41.482046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"724adc5b10b22ae24fefa6192b2c894b7ec9ec56dbcd309d20f758edfdf79640"} Feb 25 07:22:41 crc kubenswrapper[4749]: I0225 07:22:41.482548 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"40ab2ba3897bc174dedc4e0156538a833dd2b56ff6139b83b0132c925e711b80"} Feb 25 07:22:41 crc kubenswrapper[4749]: I0225 07:22:41.482566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"adccbe4fe7c0ee19d874f731542baa4891bfe6868f7eae13cb14cf66d1e94257"} Feb 25 07:22:42 crc kubenswrapper[4749]: I0225 07:22:42.490628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c3db03450fc1620ae87a76e2ee5d2ecdfa6674d02ea4a481df474e3e32a2bada"} Feb 25 07:22:42 crc kubenswrapper[4749]: I0225 07:22:42.490987 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:42 crc kubenswrapper[4749]: I0225 07:22:42.491005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fce9840f5e7526834560e18399ff70c5f24c2bcda8e542a228ceecbbf141da0c"} Feb 25 07:22:42 crc kubenswrapper[4749]: I0225 07:22:42.490999 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:42 crc kubenswrapper[4749]: I0225 07:22:42.491037 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:44 crc kubenswrapper[4749]: I0225 07:22:44.347697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:44 crc 
kubenswrapper[4749]: I0225 07:22:44.348012 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:44 crc kubenswrapper[4749]: I0225 07:22:44.356476 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:44 crc kubenswrapper[4749]: I0225 07:22:44.370098 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:22:44 crc kubenswrapper[4749]: I0225 07:22:44.370454 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 25 07:22:44 crc kubenswrapper[4749]: I0225 07:22:44.370545 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 25 07:22:47 crc kubenswrapper[4749]: I0225 07:22:47.508027 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:47 crc kubenswrapper[4749]: I0225 07:22:47.581848 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1852a79f-f32c-4d26-b79f-679f69268035" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.527255 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.527298 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.531701 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1852a79f-f32c-4d26-b79f-679f69268035" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.534766 4749 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://adccbe4fe7c0ee19d874f731542baa4891bfe6868f7eae13cb14cf66d1e94257" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.534802 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:22:48 crc kubenswrapper[4749]: I0225 07:22:48.835694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:22:49 crc kubenswrapper[4749]: I0225 07:22:49.534557 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:49 crc kubenswrapper[4749]: I0225 07:22:49.534645 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6de5aab1-70e2-4c55-8c06-59cc173abb84" Feb 25 07:22:49 crc kubenswrapper[4749]: I0225 07:22:49.539544 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1852a79f-f32c-4d26-b79f-679f69268035" Feb 25 07:22:54 crc kubenswrapper[4749]: I0225 07:22:54.371068 4749 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 25 07:22:54 crc kubenswrapper[4749]: I0225 07:22:54.371844 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 25 07:22:57 crc kubenswrapper[4749]: I0225 07:22:57.095214 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.162960 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.354675 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.512265 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.609245 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.643521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.814755 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.935820 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 25 07:22:58 crc kubenswrapper[4749]: I0225 07:22:58.967384 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.072891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.164859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.284957 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.333090 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.394869 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.584691 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.614475 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.785582 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.948497 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 25 07:22:59 crc kubenswrapper[4749]: I0225 07:22:59.982070 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.094858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.122083 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.156236 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.191527 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.210244 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.256221 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.277678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.597872 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.730300 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 
07:23:00.931069 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.947086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 25 07:23:00 crc kubenswrapper[4749]: I0225 07:23:00.993424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.033284 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.043043 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.056685 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.058178 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.089767 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.093385 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.234298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.249095 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 
25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.274814 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.364085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.366738 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.407494 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.414255 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.438094 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.466239 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.702453 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.710248 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.722528 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.787999 
4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.799756 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 25 07:23:01 crc kubenswrapper[4749]: I0225 07:23:01.887096 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.111057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.157424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.257101 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.276958 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.295444 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.311777 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.323434 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.400749 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.409558 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.597734 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.671214 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.782991 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.785477 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.807467 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.841387 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.850728 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.922461 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.924214 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 
07:23:02.935439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 25 07:23:02 crc kubenswrapper[4749]: I0225 07:23:02.966274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.021659 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.217992 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.255339 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.256838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.282820 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.288958 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.301587 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.362391 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.449935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 25 
07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.467031 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.571960 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.585281 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.612720 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.637204 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.689925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.698173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.701953 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.734725 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.774978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.853515 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.894300 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 25 07:23:03 crc kubenswrapper[4749]: I0225 07:23:03.995368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.021133 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.274126 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.276313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.279183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.281839 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.288739 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.370423 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 25 07:23:04 crc 
kubenswrapper[4749]: I0225 07:23:04.370508 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.370648 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.371497 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"73d2a413742beca5a25e7d0f86077e097224966e6269a6bc5707a8516700f618"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.371734 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://73d2a413742beca5a25e7d0f86077e097224966e6269a6bc5707a8516700f618" gracePeriod=30 Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.416421 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.475258 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.478895 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 
07:23:04.509348 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.535430 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.622351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.672788 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.676351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.810424 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 25 07:23:04 crc kubenswrapper[4749]: I0225 07:23:04.931377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.064323 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.096765 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.113724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.134623 4749 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.208036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.248882 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.296555 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.312361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.360464 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.400661 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.439333 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.536714 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.664261 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.758905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.759016 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.841323 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.842807 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.847116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.847175 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.847149954 podStartE2EDuration="42.847149954s" podCreationTimestamp="2026-02-25 07:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:22:47.556942473 +0000 UTC m=+320.918768493" watchObservedRunningTime="2026-02-25 07:23:05.847149954 +0000 UTC m=+339.208976014" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.850075 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.850139 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.858633 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.870875 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.870841791 podStartE2EDuration="18.870841791s" 
podCreationTimestamp="2026-02-25 07:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:23:05.870286267 +0000 UTC m=+339.232112297" watchObservedRunningTime="2026-02-25 07:23:05.870841791 +0000 UTC m=+339.232667851" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.933841 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.944615 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 25 07:23:05 crc kubenswrapper[4749]: I0225 07:23:05.972887 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.032081 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.038725 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.038936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.054508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.062403 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.064381 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.071967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.092023 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.174672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.197011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.293482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.443549 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.458928 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.536901 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.619738 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.627967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 25 07:23:06 
crc kubenswrapper[4749]: I0225 07:23:06.746374 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.877145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.947824 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 25 07:23:06 crc kubenswrapper[4749]: I0225 07:23:06.956233 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.000032 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.048277 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.060397 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.119718 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.181108 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.198487 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.379786 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.388760 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.504917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.527398 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.547532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.555475 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.575011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.577740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.662262 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.670587 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.691686 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 25 07:23:07 crc 
kubenswrapper[4749]: I0225 07:23:07.781168 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.809578 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.882346 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 25 07:23:07 crc kubenswrapper[4749]: I0225 07:23:07.952774 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.183688 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.184825 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.218906 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.323893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.398053 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.413833 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.599711 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.739071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.925493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.927803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 25 07:23:08 crc kubenswrapper[4749]: I0225 07:23:08.939498 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.111917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.183780 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.345515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.367102 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.383412 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.409022 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.422138 4749 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.432745 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.557002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.607960 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.612349 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.681615 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.814979 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.865138 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.865405 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3" gracePeriod=5 Feb 25 07:23:09 crc kubenswrapper[4749]: I0225 07:23:09.939690 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.058907 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.065381 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.189110 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.200071 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.224981 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.321018 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.493893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.665699 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.720699 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.809355 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.835173 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 25 07:23:10 crc kubenswrapper[4749]: I0225 07:23:10.953367 4749 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.134580 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.136685 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.236173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.244429 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.278791 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.395104 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.420452 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.579410 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 25 07:23:11 crc kubenswrapper[4749]: I0225 07:23:11.779486 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.208753 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.328123 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.501536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.549191 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.674369 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.681238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.790684 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 25 07:23:12 crc kubenswrapper[4749]: I0225 07:23:12.916277 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 25 07:23:13 crc kubenswrapper[4749]: I0225 07:23:13.823069 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.467103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.467512 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.522807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.522860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.522925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.522965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.523035 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.523258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: 
"manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.523288 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.523339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.523360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.534015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.624185 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.624237 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.624258 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.624278 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.624297 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.720052 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.720155 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3" exitCode=137 Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.720305 4749 scope.go:117] "RemoveContainer" 
containerID="fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.720650 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.758871 4749 scope.go:117] "RemoveContainer" containerID="fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3" Feb 25 07:23:15 crc kubenswrapper[4749]: E0225 07:23:15.759506 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3\": container with ID starting with fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3 not found: ID does not exist" containerID="fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3" Feb 25 07:23:15 crc kubenswrapper[4749]: I0225 07:23:15.759742 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3"} err="failed to get container status \"fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3\": rpc error: code = NotFound desc = could not find container \"fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3\": container with ID starting with fdbf6b97bc71721e8eb3b582d096722f234f36351124a68e5b8c9463902d78d3 not found: ID does not exist" Feb 25 07:23:17 crc kubenswrapper[4749]: I0225 07:23:17.336158 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 25 07:23:17 crc kubenswrapper[4749]: I0225 07:23:17.336884 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 25 07:23:17 crc kubenswrapper[4749]: 
I0225 07:23:17.351870 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 07:23:17 crc kubenswrapper[4749]: I0225 07:23:17.351920 4749 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="92a493aa-4c87-48f2-a276-00f995fd2764" Feb 25 07:23:17 crc kubenswrapper[4749]: I0225 07:23:17.358146 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 25 07:23:17 crc kubenswrapper[4749]: I0225 07:23:17.358192 4749 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="92a493aa-4c87-48f2-a276-00f995fd2764" Feb 25 07:23:26 crc kubenswrapper[4749]: I0225 07:23:26.000003 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 25 07:23:27 crc kubenswrapper[4749]: I0225 07:23:27.948794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 25 07:23:27 crc kubenswrapper[4749]: I0225 07:23:27.968314 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 25 07:23:28 crc kubenswrapper[4749]: I0225 07:23:28.533536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 25 07:23:28 crc kubenswrapper[4749]: I0225 07:23:28.876354 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 25 07:23:30 crc kubenswrapper[4749]: I0225 07:23:30.562979 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 25 07:23:30 crc 
kubenswrapper[4749]: I0225 07:23:30.818788 4749 generic.go:334] "Generic (PLEG): container finished" podID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerID="d29f40878a16488090d33422670ff524a2e1b2c5009b60a5046e69d023b3cfd2" exitCode=0 Feb 25 07:23:30 crc kubenswrapper[4749]: I0225 07:23:30.818831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerDied","Data":"d29f40878a16488090d33422670ff524a2e1b2c5009b60a5046e69d023b3cfd2"} Feb 25 07:23:30 crc kubenswrapper[4749]: I0225 07:23:30.819220 4749 scope.go:117] "RemoveContainer" containerID="d29f40878a16488090d33422670ff524a2e1b2c5009b60a5046e69d023b3cfd2" Feb 25 07:23:31 crc kubenswrapper[4749]: I0225 07:23:31.453832 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 25 07:23:31 crc kubenswrapper[4749]: I0225 07:23:31.827348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerStarted","Data":"65a1e2ef45b2d03c91f2ffecc2205ba554bec6d54abcc62001c2f5d53a4c5d10"} Feb 25 07:23:31 crc kubenswrapper[4749]: I0225 07:23:31.827770 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:23:31 crc kubenswrapper[4749]: I0225 07:23:31.829008 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:23:32 crc kubenswrapper[4749]: I0225 07:23:32.900563 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 25 07:23:33 crc kubenswrapper[4749]: I0225 07:23:33.561708 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 25 07:23:33 crc kubenswrapper[4749]: I0225 07:23:33.799452 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 25 07:23:33 crc kubenswrapper[4749]: I0225 07:23:33.924454 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.228455 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.641634 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.850576 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.854225 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.855242 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.855337 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="73d2a413742beca5a25e7d0f86077e097224966e6269a6bc5707a8516700f618" exitCode=137 Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.855384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"73d2a413742beca5a25e7d0f86077e097224966e6269a6bc5707a8516700f618"} Feb 25 07:23:34 crc kubenswrapper[4749]: I0225 07:23:34.855444 4749 scope.go:117] "RemoveContainer" containerID="f32da62cdfabcc30e1db7d99b4ed54efe8ac5891e90158255a46685556cf7497" Feb 25 07:23:35 crc kubenswrapper[4749]: I0225 07:23:35.470320 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 25 07:23:35 crc kubenswrapper[4749]: I0225 07:23:35.862905 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 25 07:23:35 crc kubenswrapper[4749]: I0225 07:23:35.864229 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 25 07:23:35 crc kubenswrapper[4749]: I0225 07:23:35.864276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7fa8329cf794f409747985a793cde28b3938fc244afe0d0b0e39ab4c2ff06ac8"} Feb 25 07:23:35 crc kubenswrapper[4749]: I0225 07:23:35.903530 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 25 07:23:38 crc kubenswrapper[4749]: I0225 07:23:38.610453 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 25 07:23:38 crc kubenswrapper[4749]: I0225 07:23:38.835949 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:23:39 crc kubenswrapper[4749]: I0225 07:23:39.298315 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 25 07:23:40 crc kubenswrapper[4749]: I0225 07:23:40.903676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 25 07:23:41 crc kubenswrapper[4749]: I0225 07:23:41.584756 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 25 07:23:43 crc kubenswrapper[4749]: I0225 07:23:43.299821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 25 07:23:44 crc kubenswrapper[4749]: I0225 07:23:44.099251 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 25 07:23:44 crc kubenswrapper[4749]: I0225 07:23:44.175370 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 25 07:23:44 crc kubenswrapper[4749]: I0225 07:23:44.370252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:23:44 crc kubenswrapper[4749]: I0225 07:23:44.375829 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:23:48 crc kubenswrapper[4749]: I0225 07:23:48.842212 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 25 07:23:51 crc kubenswrapper[4749]: I0225 07:23:51.451025 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 25 07:23:54 
crc kubenswrapper[4749]: I0225 07:23:54.256108 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 25 07:23:55 crc kubenswrapper[4749]: I0225 07:23:55.284801 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.200666 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533404-5d6x5"] Feb 25 07:24:00 crc kubenswrapper[4749]: E0225 07:24:00.201894 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011732e4-f837-46cd-91c1-c05625b9b584" containerName="installer" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.201917 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="011732e4-f837-46cd-91c1-c05625b9b584" containerName="installer" Feb 25 07:24:00 crc kubenswrapper[4749]: E0225 07:24:00.201946 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.201960 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.202153 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="011732e4-f837-46cd-91c1-c05625b9b584" containerName="installer" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.202176 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.202779 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.206698 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.206735 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.207083 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.218361 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533404-5d6x5"] Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.325421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqvr4\" (UniqueName: \"kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4\") pod \"auto-csr-approver-29533404-5d6x5\" (UID: \"f11de3f9-29e5-4785-8769-3e1ea90ab4d9\") " pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.429792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqvr4\" (UniqueName: \"kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4\") pod \"auto-csr-approver-29533404-5d6x5\" (UID: \"f11de3f9-29e5-4785-8769-3e1ea90ab4d9\") " pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.469744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqvr4\" (UniqueName: \"kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4\") pod \"auto-csr-approver-29533404-5d6x5\" (UID: \"f11de3f9-29e5-4785-8769-3e1ea90ab4d9\") " 
pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.531496 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:00 crc kubenswrapper[4749]: I0225 07:24:00.982398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533404-5d6x5"] Feb 25 07:24:01 crc kubenswrapper[4749]: I0225 07:24:01.038426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" event={"ID":"f11de3f9-29e5-4785-8769-3e1ea90ab4d9","Type":"ContainerStarted","Data":"d401b737841c66a9d30a827a4763c9cf51862fe0b3d20d91d981a9d428f04e2a"} Feb 25 07:24:03 crc kubenswrapper[4749]: I0225 07:24:03.051407 4749 generic.go:334] "Generic (PLEG): container finished" podID="f11de3f9-29e5-4785-8769-3e1ea90ab4d9" containerID="c533acd952e9dbc20cd9cac9154998e45aaeb53c83c30768a91c3f094ffde5b8" exitCode=0 Feb 25 07:24:03 crc kubenswrapper[4749]: I0225 07:24:03.051477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" event={"ID":"f11de3f9-29e5-4785-8769-3e1ea90ab4d9","Type":"ContainerDied","Data":"c533acd952e9dbc20cd9cac9154998e45aaeb53c83c30768a91c3f094ffde5b8"} Feb 25 07:24:04 crc kubenswrapper[4749]: I0225 07:24:04.381047 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:04 crc kubenswrapper[4749]: I0225 07:24:04.579494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqvr4\" (UniqueName: \"kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4\") pod \"f11de3f9-29e5-4785-8769-3e1ea90ab4d9\" (UID: \"f11de3f9-29e5-4785-8769-3e1ea90ab4d9\") " Feb 25 07:24:04 crc kubenswrapper[4749]: I0225 07:24:04.585232 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4" (OuterVolumeSpecName: "kube-api-access-jqvr4") pod "f11de3f9-29e5-4785-8769-3e1ea90ab4d9" (UID: "f11de3f9-29e5-4785-8769-3e1ea90ab4d9"). InnerVolumeSpecName "kube-api-access-jqvr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:04 crc kubenswrapper[4749]: I0225 07:24:04.680520 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqvr4\" (UniqueName: \"kubernetes.io/projected/f11de3f9-29e5-4785-8769-3e1ea90ab4d9-kube-api-access-jqvr4\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:05 crc kubenswrapper[4749]: I0225 07:24:05.071004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" event={"ID":"f11de3f9-29e5-4785-8769-3e1ea90ab4d9","Type":"ContainerDied","Data":"d401b737841c66a9d30a827a4763c9cf51862fe0b3d20d91d981a9d428f04e2a"} Feb 25 07:24:05 crc kubenswrapper[4749]: I0225 07:24:05.071055 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d401b737841c66a9d30a827a4763c9cf51862fe0b3d20d91d981a9d428f04e2a" Feb 25 07:24:05 crc kubenswrapper[4749]: I0225 07:24:05.071094 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533404-5d6x5" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.365673 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z72bh"] Feb 25 07:24:26 crc kubenswrapper[4749]: E0225 07:24:26.366520 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11de3f9-29e5-4785-8769-3e1ea90ab4d9" containerName="oc" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.366537 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11de3f9-29e5-4785-8769-3e1ea90ab4d9" containerName="oc" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.366715 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11de3f9-29e5-4785-8769-3e1ea90ab4d9" containerName="oc" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.367289 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.372354 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z72bh"] Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-certificates\") pod \"image-registry-66df7c8f76-z72bh\" (UID: 
\"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-trusted-ca\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-bound-sa-token\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkvp\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-kube-api-access-4gkvp\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-tls\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.554704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.577431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.655890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-trusted-ca\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.655939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-bound-sa-token\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc 
kubenswrapper[4749]: I0225 07:24:26.655959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkvp\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-kube-api-access-4gkvp\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.655997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.656015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-tls\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.656039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.656076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-certificates\") pod \"image-registry-66df7c8f76-z72bh\" (UID: 
\"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.657043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-ca-trust-extracted\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.657561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-certificates\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.658970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-trusted-ca\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.664252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-installation-pull-secrets\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.665491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-registry-tls\") pod 
\"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.678061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkvp\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-kube-api-access-4gkvp\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.680046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64-bound-sa-token\") pod \"image-registry-66df7c8f76-z72bh\" (UID: \"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64\") " pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.694929 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:26 crc kubenswrapper[4749]: I0225 07:24:26.914106 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-z72bh"] Feb 25 07:24:26 crc kubenswrapper[4749]: W0225 07:24:26.921358 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e8ed41_4f43_4d01_ac5c_bbffcfd5fa64.slice/crio-4fa3a21ecde8520cdecf54dc95b4042df3729d1e3a4a50ed96cc772a89742b03 WatchSource:0}: Error finding container 4fa3a21ecde8520cdecf54dc95b4042df3729d1e3a4a50ed96cc772a89742b03: Status 404 returned error can't find the container with id 4fa3a21ecde8520cdecf54dc95b4042df3729d1e3a4a50ed96cc772a89742b03 Feb 25 07:24:27 crc kubenswrapper[4749]: I0225 07:24:27.232628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" event={"ID":"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64","Type":"ContainerStarted","Data":"c3dee1c7d5c0e1e3d61bbf0c5e92a8a564d568112c783cccbe938e4e022e27c1"} Feb 25 07:24:27 crc kubenswrapper[4749]: I0225 07:24:27.232693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" event={"ID":"07e8ed41-4f43-4d01-ac5c-bbffcfd5fa64","Type":"ContainerStarted","Data":"4fa3a21ecde8520cdecf54dc95b4042df3729d1e3a4a50ed96cc772a89742b03"} Feb 25 07:24:27 crc kubenswrapper[4749]: I0225 07:24:27.232867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:27 crc kubenswrapper[4749]: I0225 07:24:27.262998 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" podStartSLOduration=1.262978785 podStartE2EDuration="1.262978785s" podCreationTimestamp="2026-02-25 07:24:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:24:27.258556218 +0000 UTC m=+420.620382248" watchObservedRunningTime="2026-02-25 07:24:27.262978785 +0000 UTC m=+420.624804815" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.092266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.093172 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8fzjk" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="registry-server" containerID="cri-o://7781b4539e67b111f9881202cac6c9da17118a0ecd6c1b19300d33c851b9561c" gracePeriod=30 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.111731 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.112368 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nkp47" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="registry-server" containerID="cri-o://af1fb59810b6dae52d8a2a2a5772c89962bf54cfa901a2cba3aeddd79ea21deb" gracePeriod=30 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.120394 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.120584 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.120860 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x464k" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="registry-server" 
containerID="cri-o://886b026975f2dac30b1a71c4801c3333e1705e479791593946741f718ca6b798" gracePeriod=30 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.127923 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" containerID="cri-o://65a1e2ef45b2d03c91f2ffecc2205ba554bec6d54abcc62001c2f5d53a4c5d10" gracePeriod=30 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.138330 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pk7lm"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.139564 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.149446 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.149855 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cpkr5" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="registry-server" containerID="cri-o://5676e5c058c15e555a4a31919d1fda4117656d1fac0f10d679131957faa16003" gracePeriod=30 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.155540 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pk7lm"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.244376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpnv5\" (UniqueName: \"kubernetes.io/projected/e899a950-b7af-4fe2-b9db-856858e051fc-kube-api-access-kpnv5\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.244837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.244959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.345749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpnv5\" (UniqueName: \"kubernetes.io/projected/e899a950-b7af-4fe2-b9db-856858e051fc-kube-api-access-kpnv5\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.345821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.345884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.347768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.348279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerDied","Data":"5676e5c058c15e555a4a31919d1fda4117656d1fac0f10d679131957faa16003"} Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.348154 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerID="5676e5c058c15e555a4a31919d1fda4117656d1fac0f10d679131957faa16003" exitCode=0 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.352277 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerID="7781b4539e67b111f9881202cac6c9da17118a0ecd6c1b19300d33c851b9561c" exitCode=0 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.352372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerDied","Data":"7781b4539e67b111f9881202cac6c9da17118a0ecd6c1b19300d33c851b9561c"} Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.354340 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerID="af1fb59810b6dae52d8a2a2a5772c89962bf54cfa901a2cba3aeddd79ea21deb" exitCode=0 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.354400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerDied","Data":"af1fb59810b6dae52d8a2a2a5772c89962bf54cfa901a2cba3aeddd79ea21deb"} Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.354674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e899a950-b7af-4fe2-b9db-856858e051fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.355791 4749 generic.go:334] "Generic (PLEG): container finished" podID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerID="65a1e2ef45b2d03c91f2ffecc2205ba554bec6d54abcc62001c2f5d53a4c5d10" exitCode=0 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.355841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerDied","Data":"65a1e2ef45b2d03c91f2ffecc2205ba554bec6d54abcc62001c2f5d53a4c5d10"} Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.355885 4749 scope.go:117] "RemoveContainer" containerID="d29f40878a16488090d33422670ff524a2e1b2c5009b60a5046e69d023b3cfd2" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.359898 4749 generic.go:334] "Generic (PLEG): container finished" podID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerID="886b026975f2dac30b1a71c4801c3333e1705e479791593946741f718ca6b798" exitCode=0 Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.359932 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerDied","Data":"886b026975f2dac30b1a71c4801c3333e1705e479791593946741f718ca6b798"} Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.369983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpnv5\" (UniqueName: \"kubernetes.io/projected/e899a950-b7af-4fe2-b9db-856858e051fc-kube-api-access-kpnv5\") pod \"marketplace-operator-79b997595-pk7lm\" (UID: \"e899a950-b7af-4fe2-b9db-856858e051fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.506622 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.640479 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.645082 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.649695 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.678580 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.685775 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.706774 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-z72bh" Feb 25 07:24:46 crc kubenswrapper[4749]: W0225 07:24:46.725126 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode899a950_b7af_4fe2_b9db_856858e051fc.slice/crio-cfc050b32e9423108ac8f909ab47b5e77b0cf03592c4ec9af41ec8c931cdb10c WatchSource:0}: Error finding container cfc050b32e9423108ac8f909ab47b5e77b0cf03592c4ec9af41ec8c931cdb10c: Status 404 returned error can't find the container with id cfc050b32e9423108ac8f909ab47b5e77b0cf03592c4ec9af41ec8c931cdb10c Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.727578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pk7lm"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content\") pod \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752522 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities\") pod \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5cnk\" (UniqueName: \"kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk\") pod 
\"4ff8f61f-b1e7-4a56-b1aa-f427189de773\" (UID: \"4ff8f61f-b1e7-4a56-b1aa-f427189de773\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752621 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nfl\" (UniqueName: \"kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl\") pod \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content\") pod \"cd685401-e0d6-42b1-8f37-f981f46c62b8\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities\") pod \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics\") pod \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.752738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities\") pod \"cd685401-e0d6-42b1-8f37-f981f46c62b8\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.753846 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca\") pod \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\" (UID: \"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.753966 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbxsl\" (UniqueName: \"kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl\") pod \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.754075 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content\") pod \"7e2f1824-0c70-45fa-8b96-c41047b08d69\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.754145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxln7\" (UniqueName: \"kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7\") pod \"7e2f1824-0c70-45fa-8b96-c41047b08d69\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.754173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content\") pod \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\" (UID: \"8f869be2-b41b-4117-a9b4-ed628ae0d30b\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.754197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities\") pod 
\"7e2f1824-0c70-45fa-8b96-c41047b08d69\" (UID: \"7e2f1824-0c70-45fa-8b96-c41047b08d69\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.754227 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcl8\" (UniqueName: \"kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8\") pod \"cd685401-e0d6-42b1-8f37-f981f46c62b8\" (UID: \"cd685401-e0d6-42b1-8f37-f981f46c62b8\") " Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.753279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities" (OuterVolumeSpecName: "utilities") pod "4ff8f61f-b1e7-4a56-b1aa-f427189de773" (UID: "4ff8f61f-b1e7-4a56-b1aa-f427189de773"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.757025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities" (OuterVolumeSpecName: "utilities") pod "8f869be2-b41b-4117-a9b4-ed628ae0d30b" (UID: "8f869be2-b41b-4117-a9b4-ed628ae0d30b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.757389 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities" (OuterVolumeSpecName: "utilities") pod "cd685401-e0d6-42b1-8f37-f981f46c62b8" (UID: "cd685401-e0d6-42b1-8f37-f981f46c62b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.757447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities" (OuterVolumeSpecName: "utilities") pod "7e2f1824-0c70-45fa-8b96-c41047b08d69" (UID: "7e2f1824-0c70-45fa-8b96-c41047b08d69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.758555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl" (OuterVolumeSpecName: "kube-api-access-x8nfl") pod "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" (UID: "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf"). InnerVolumeSpecName "kube-api-access-x8nfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.759875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8" (OuterVolumeSpecName: "kube-api-access-4vcl8") pod "cd685401-e0d6-42b1-8f37-f981f46c62b8" (UID: "cd685401-e0d6-42b1-8f37-f981f46c62b8"). InnerVolumeSpecName "kube-api-access-4vcl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.760183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" (UID: "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.766556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" (UID: "28ca9d7e-e240-4db5-a4a2-a3350e23b0cf"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.778800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7" (OuterVolumeSpecName: "kube-api-access-cxln7") pod "7e2f1824-0c70-45fa-8b96-c41047b08d69" (UID: "7e2f1824-0c70-45fa-8b96-c41047b08d69"). InnerVolumeSpecName "kube-api-access-cxln7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.786794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk" (OuterVolumeSpecName: "kube-api-access-f5cnk") pod "4ff8f61f-b1e7-4a56-b1aa-f427189de773" (UID: "4ff8f61f-b1e7-4a56-b1aa-f427189de773"). InnerVolumeSpecName "kube-api-access-f5cnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.799572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f869be2-b41b-4117-a9b4-ed628ae0d30b" (UID: "8f869be2-b41b-4117-a9b4-ed628ae0d30b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.808668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl" (OuterVolumeSpecName: "kube-api-access-jbxsl") pod "8f869be2-b41b-4117-a9b4-ed628ae0d30b" (UID: "8f869be2-b41b-4117-a9b4-ed628ae0d30b"). InnerVolumeSpecName "kube-api-access-jbxsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.814262 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.851245 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd685401-e0d6-42b1-8f37-f981f46c62b8" (UID: "cd685401-e0d6-42b1-8f37-f981f46c62b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.851870 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ff8f61f-b1e7-4a56-b1aa-f427189de773" (UID: "4ff8f61f-b1e7-4a56-b1aa-f427189de773"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859259 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcl8\" (UniqueName: \"kubernetes.io/projected/cd685401-e0d6-42b1-8f37-f981f46c62b8-kube-api-access-4vcl8\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859496 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859505 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ff8f61f-b1e7-4a56-b1aa-f427189de773-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859513 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5cnk\" (UniqueName: \"kubernetes.io/projected/4ff8f61f-b1e7-4a56-b1aa-f427189de773-kube-api-access-f5cnk\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nfl\" (UniqueName: \"kubernetes.io/projected/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-kube-api-access-x8nfl\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859532 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859539 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: 
I0225 07:24:46.859547 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859555 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd685401-e0d6-42b1-8f37-f981f46c62b8-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859564 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859573 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbxsl\" (UniqueName: \"kubernetes.io/projected/8f869be2-b41b-4117-a9b4-ed628ae0d30b-kube-api-access-jbxsl\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859581 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxln7\" (UniqueName: \"kubernetes.io/projected/7e2f1824-0c70-45fa-8b96-c41047b08d69-kube-api-access-cxln7\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859602 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f869be2-b41b-4117-a9b4-ed628ae0d30b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.859610 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:46 crc kubenswrapper[4749]: I0225 07:24:46.965148 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e2f1824-0c70-45fa-8b96-c41047b08d69" (UID: "7e2f1824-0c70-45fa-8b96-c41047b08d69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.062848 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e2f1824-0c70-45fa-8b96-c41047b08d69-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.366717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" event={"ID":"28ca9d7e-e240-4db5-a4a2-a3350e23b0cf","Type":"ContainerDied","Data":"77cfa9dfadfb16c289ffe8374c308b97f0576ecb16b8084362c927faacc5064e"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.367853 4749 scope.go:117] "RemoveContainer" containerID="65a1e2ef45b2d03c91f2ffecc2205ba554bec6d54abcc62001c2f5d53a4c5d10" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.367194 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-djp99" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.369890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x464k" event={"ID":"8f869be2-b41b-4117-a9b4-ed628ae0d30b","Type":"ContainerDied","Data":"1c6a3c79aa8caa68c52d56f1deaed660f0e81538bd5e47edaee2f9a66ff03d0f"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.370036 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x464k" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.374382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" event={"ID":"e899a950-b7af-4fe2-b9db-856858e051fc","Type":"ContainerStarted","Data":"afa2f7f18e1c122bf8c2d0cb5ecbcb823aca543650aafc1a40fa21fd7bbb57d7"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.375348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" event={"ID":"e899a950-b7af-4fe2-b9db-856858e051fc","Type":"ContainerStarted","Data":"cfc050b32e9423108ac8f909ab47b5e77b0cf03592c4ec9af41ec8c931cdb10c"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.375713 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.377388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpkr5" event={"ID":"7e2f1824-0c70-45fa-8b96-c41047b08d69","Type":"ContainerDied","Data":"e0088db6006827c76c5b14c9ffd24aa8d44f85c3f0c5a0197021e74d755bbbd8"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.377430 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpkr5" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.380497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8fzjk" event={"ID":"4ff8f61f-b1e7-4a56-b1aa-f427189de773","Type":"ContainerDied","Data":"72d7f4829b8dd23378a17ccda6bd5845896feef6547eab29086f3693f8845099"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.380580 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8fzjk" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.383104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nkp47" event={"ID":"cd685401-e0d6-42b1-8f37-f981f46c62b8","Type":"ContainerDied","Data":"fda55101487832e2d54d7fcaa999c4996711358c29b6dfbf9daeb60a28fa58c5"} Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.383637 4749 scope.go:117] "RemoveContainer" containerID="886b026975f2dac30b1a71c4801c3333e1705e479791593946741f718ca6b798" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.385136 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nkp47" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.389798 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.391276 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.391303 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x464k"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.406311 4749 scope.go:117] "RemoveContainer" containerID="e668762484a92d18135913cbefb9b239ba7b4887dd3d8b0c3c87c34fefc72b93" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.410137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.415003 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-djp99"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.420182 4749 scope.go:117] "RemoveContainer" 
containerID="6793b7a71491c5ef41ea8267e4732791aaefa54c7f9f43261d418f82e6ef7d80" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.424930 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" podStartSLOduration=1.4249162100000001 podStartE2EDuration="1.42491621s" podCreationTimestamp="2026-02-25 07:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:24:47.424296505 +0000 UTC m=+440.786122525" watchObservedRunningTime="2026-02-25 07:24:47.42491621 +0000 UTC m=+440.786742220" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.442498 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.444368 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nkp47"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.449155 4749 scope.go:117] "RemoveContainer" containerID="5676e5c058c15e555a4a31919d1fda4117656d1fac0f10d679131957faa16003" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.456432 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.465058 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8fzjk"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.467962 4749 scope.go:117] "RemoveContainer" containerID="838ee78d0775f1f4d2216c79b3c3200876868c2f15c669dba019745044ffaaa4" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.468071 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.474400 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-operators-cpkr5"] Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.537911 4749 scope.go:117] "RemoveContainer" containerID="df4404d0e2f3aef60f2128a15e67a628a06f0f9f0dbce2df27bf814f2aeee4a1" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.558977 4749 scope.go:117] "RemoveContainer" containerID="7781b4539e67b111f9881202cac6c9da17118a0ecd6c1b19300d33c851b9561c" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.571517 4749 scope.go:117] "RemoveContainer" containerID="896f742f84f19750c6bfa48b08974c4b4d31d50a9db359fddbe46a13627e20fb" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.583258 4749 scope.go:117] "RemoveContainer" containerID="d9fb533efebc17538921e1bc144a9693804a1c5685db2237132286f41e18aedf" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.594712 4749 scope.go:117] "RemoveContainer" containerID="af1fb59810b6dae52d8a2a2a5772c89962bf54cfa901a2cba3aeddd79ea21deb" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.606562 4749 scope.go:117] "RemoveContainer" containerID="f7b732d4aca42f8c73c5e7d4764b394064702e3b364c5be723737698f2017eab" Feb 25 07:24:47 crc kubenswrapper[4749]: I0225 07:24:47.620211 4749 scope.go:117] "RemoveContainer" containerID="120bf757d7501987a3c104e8a4d028c1693b31140be8ac68ff3224c3071b19a4" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.306503 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5q9mx"] Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.306965 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.306978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307007 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307013 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307020 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307026 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307034 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307040 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307049 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307055 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307083 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307089 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307097 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307102 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307111 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307125 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307130 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307161 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307168 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307175 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307180 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307190 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307196 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307208 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="extract-content" Feb 25 07:24:48 crc kubenswrapper[4749]: E0225 07:24:48.307215 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307240 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="extract-utilities" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307346 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307357 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307363 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307371 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307400 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" containerName="registry-server" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.307618 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" containerName="marketplace-operator" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.308181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.311252 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.322871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5q9mx"] Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.380839 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-catalog-content\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.380887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wqk\" (UniqueName: \"kubernetes.io/projected/2a5c6dae-0795-463f-ae4c-f2c3de61483a-kube-api-access-t6wqk\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.380963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-utilities\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.482591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-catalog-content\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.482733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wqk\" (UniqueName: \"kubernetes.io/projected/2a5c6dae-0795-463f-ae4c-f2c3de61483a-kube-api-access-t6wqk\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.482813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-utilities\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.483052 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-catalog-content\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.483980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2a5c6dae-0795-463f-ae4c-f2c3de61483a-utilities\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.518240 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8bqdr"] Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.521517 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.522914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wqk\" (UniqueName: \"kubernetes.io/projected/2a5c6dae-0795-463f-ae4c-f2c3de61483a-kube-api-access-t6wqk\") pod \"redhat-marketplace-5q9mx\" (UID: \"2a5c6dae-0795-463f-ae4c-f2c3de61483a\") " pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.527843 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bqdr"] Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.529137 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.584265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-utilities\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.584323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-catalog-content\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.584363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dqz\" (UniqueName: \"kubernetes.io/projected/424795b7-1cab-4e2e-a3c6-a3d63283910c-kube-api-access-98dqz\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.645376 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.685999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-utilities\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.686054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-catalog-content\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.686088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98dqz\" (UniqueName: \"kubernetes.io/projected/424795b7-1cab-4e2e-a3c6-a3d63283910c-kube-api-access-98dqz\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " 
pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.688273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-utilities\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.688901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/424795b7-1cab-4e2e-a3c6-a3d63283910c-catalog-content\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.711470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dqz\" (UniqueName: \"kubernetes.io/projected/424795b7-1cab-4e2e-a3c6-a3d63283910c-kube-api-access-98dqz\") pod \"certified-operators-8bqdr\" (UID: \"424795b7-1cab-4e2e-a3c6-a3d63283910c\") " pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:48 crc kubenswrapper[4749]: I0225 07:24:48.851265 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.008063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bqdr"] Feb 25 07:24:49 crc kubenswrapper[4749]: W0225 07:24:49.011085 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424795b7_1cab_4e2e_a3c6_a3d63283910c.slice/crio-05ec3297cd57278c5393d63c56c975971c081a93d576da31edcfd5ff7c98a62d WatchSource:0}: Error finding container 05ec3297cd57278c5393d63c56c975971c081a93d576da31edcfd5ff7c98a62d: Status 404 returned error can't find the container with id 05ec3297cd57278c5393d63c56c975971c081a93d576da31edcfd5ff7c98a62d Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.127541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5q9mx"] Feb 25 07:24:49 crc kubenswrapper[4749]: W0225 07:24:49.131661 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a5c6dae_0795_463f_ae4c_f2c3de61483a.slice/crio-d6eae47c78ee91480db4938085c55c2fa2fb2547745716ce21425f6ba666c340 WatchSource:0}: Error finding container d6eae47c78ee91480db4938085c55c2fa2fb2547745716ce21425f6ba666c340: Status 404 returned error can't find the container with id d6eae47c78ee91480db4938085c55c2fa2fb2547745716ce21425f6ba666c340 Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.332612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ca9d7e-e240-4db5-a4a2-a3350e23b0cf" path="/var/lib/kubelet/pods/28ca9d7e-e240-4db5-a4a2-a3350e23b0cf/volumes" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.333200 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff8f61f-b1e7-4a56-b1aa-f427189de773" 
path="/var/lib/kubelet/pods/4ff8f61f-b1e7-4a56-b1aa-f427189de773/volumes" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.333753 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2f1824-0c70-45fa-8b96-c41047b08d69" path="/var/lib/kubelet/pods/7e2f1824-0c70-45fa-8b96-c41047b08d69/volumes" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.334699 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f869be2-b41b-4117-a9b4-ed628ae0d30b" path="/var/lib/kubelet/pods/8f869be2-b41b-4117-a9b4-ed628ae0d30b/volumes" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.335235 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd685401-e0d6-42b1-8f37-f981f46c62b8" path="/var/lib/kubelet/pods/cd685401-e0d6-42b1-8f37-f981f46c62b8/volumes" Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.401353 4749 generic.go:334] "Generic (PLEG): container finished" podID="424795b7-1cab-4e2e-a3c6-a3d63283910c" containerID="7702f779a4cd2935263abd1e2941422aa80bc73ed5960a6cabf4c3903c4625ae" exitCode=0 Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.401431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bqdr" event={"ID":"424795b7-1cab-4e2e-a3c6-a3d63283910c","Type":"ContainerDied","Data":"7702f779a4cd2935263abd1e2941422aa80bc73ed5960a6cabf4c3903c4625ae"} Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.401460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bqdr" event={"ID":"424795b7-1cab-4e2e-a3c6-a3d63283910c","Type":"ContainerStarted","Data":"05ec3297cd57278c5393d63c56c975971c081a93d576da31edcfd5ff7c98a62d"} Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.402820 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a5c6dae-0795-463f-ae4c-f2c3de61483a" containerID="746143a3898dc490d94b5015aa05811c0a62b41e4f23d9c3e134a7350d8d04e0" exitCode=0 Feb 25 07:24:49 
crc kubenswrapper[4749]: I0225 07:24:49.403645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5q9mx" event={"ID":"2a5c6dae-0795-463f-ae4c-f2c3de61483a","Type":"ContainerDied","Data":"746143a3898dc490d94b5015aa05811c0a62b41e4f23d9c3e134a7350d8d04e0"} Feb 25 07:24:49 crc kubenswrapper[4749]: I0225 07:24:49.403677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5q9mx" event={"ID":"2a5c6dae-0795-463f-ae4c-f2c3de61483a","Type":"ContainerStarted","Data":"d6eae47c78ee91480db4938085c55c2fa2fb2547745716ce21425f6ba666c340"} Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.414838 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a5c6dae-0795-463f-ae4c-f2c3de61483a" containerID="3b86d32bc9f61497914d584553e79c7c734aa75511cf0791980c3808559cb5c1" exitCode=0 Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.414972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5q9mx" event={"ID":"2a5c6dae-0795-463f-ae4c-f2c3de61483a","Type":"ContainerDied","Data":"3b86d32bc9f61497914d584553e79c7c734aa75511cf0791980c3808559cb5c1"} Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.417378 4749 generic.go:334] "Generic (PLEG): container finished" podID="424795b7-1cab-4e2e-a3c6-a3d63283910c" containerID="81b7f8a7f27dfc9bd57f19484543555ea552f9ccbba1e74829b08e20bdf66f00" exitCode=0 Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.417469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bqdr" event={"ID":"424795b7-1cab-4e2e-a3c6-a3d63283910c","Type":"ContainerDied","Data":"81b7f8a7f27dfc9bd57f19484543555ea552f9ccbba1e74829b08e20bdf66f00"} Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.702376 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zzwzn"] Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 
07:24:50.703281 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.707241 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.715146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzwzn"] Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.811213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-catalog-content\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.811473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxkh\" (UniqueName: \"kubernetes.io/projected/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-kube-api-access-bbxkh\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.811608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-utilities\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.904470 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.907350 4749 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.909643 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.912796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-utilities\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.912900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-catalog-content\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.912981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxkh\" (UniqueName: \"kubernetes.io/projected/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-kube-api-access-bbxkh\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.913877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-utilities\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.914040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-catalog-content\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.928543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:24:50 crc kubenswrapper[4749]: I0225 07:24:50.940610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxkh\" (UniqueName: \"kubernetes.io/projected/6a02c53a-d462-4339-9bf9-3dc9fbc48c71-kube-api-access-bbxkh\") pod \"redhat-operators-zzwzn\" (UID: \"6a02c53a-d462-4339-9bf9-3dc9fbc48c71\") " pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.013707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdfd\" (UniqueName: \"kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.013969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.013995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " 
pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.021988 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.115745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mdfd\" (UniqueName: \"kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.115807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.115832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.116685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.116790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.171667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdfd\" (UniqueName: \"kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd\") pod \"community-operators-hv24g\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.242727 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.422976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.426496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5q9mx" event={"ID":"2a5c6dae-0795-463f-ae4c-f2c3de61483a","Type":"ContainerStarted","Data":"587e796e9d34bac8adfba3c4523b03f4e0eee880e8b1147b2f6716c98594d839"} Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.430577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bqdr" event={"ID":"424795b7-1cab-4e2e-a3c6-a3d63283910c","Type":"ContainerStarted","Data":"ddf3b42e7a1b648334ab045889074d861dc381d26b00a57de262bab80f64b790"} Feb 25 07:24:51 crc kubenswrapper[4749]: W0225 07:24:51.433052 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45225dff_f438_45e3_bccb_bf2c2d52f4e4.slice/crio-70f63382c5f0899be190de8f3d6a4d82b69b3714aa716f69b6de06da5fba1364 WatchSource:0}: Error finding container 
70f63382c5f0899be190de8f3d6a4d82b69b3714aa716f69b6de06da5fba1364: Status 404 returned error can't find the container with id 70f63382c5f0899be190de8f3d6a4d82b69b3714aa716f69b6de06da5fba1364 Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.450876 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5q9mx" podStartSLOduration=1.944004814 podStartE2EDuration="3.450859308s" podCreationTimestamp="2026-02-25 07:24:48 +0000 UTC" firstStartedPulling="2026-02-25 07:24:49.405985913 +0000 UTC m=+442.767811933" lastFinishedPulling="2026-02-25 07:24:50.912840407 +0000 UTC m=+444.274666427" observedRunningTime="2026-02-25 07:24:51.447123997 +0000 UTC m=+444.808950017" watchObservedRunningTime="2026-02-25 07:24:51.450859308 +0000 UTC m=+444.812685328" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.470464 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zzwzn"] Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.475181 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8bqdr" podStartSLOduration=2.042932302 podStartE2EDuration="3.47515698s" podCreationTimestamp="2026-02-25 07:24:48 +0000 UTC" firstStartedPulling="2026-02-25 07:24:49.405180713 +0000 UTC m=+442.767006733" lastFinishedPulling="2026-02-25 07:24:50.837405391 +0000 UTC m=+444.199231411" observedRunningTime="2026-02-25 07:24:51.466838587 +0000 UTC m=+444.828664617" watchObservedRunningTime="2026-02-25 07:24:51.47515698 +0000 UTC m=+444.836983000" Feb 25 07:24:51 crc kubenswrapper[4749]: I0225 07:24:51.671952 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:24:51 crc 
kubenswrapper[4749]: I0225 07:24:51.672019 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.437786 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a02c53a-d462-4339-9bf9-3dc9fbc48c71" containerID="4867cab2299ef3e2d01856a463fd6535e57acc64b5a6733ed7f8140fc2817f8e" exitCode=0 Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.437869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzwzn" event={"ID":"6a02c53a-d462-4339-9bf9-3dc9fbc48c71","Type":"ContainerDied","Data":"4867cab2299ef3e2d01856a463fd6535e57acc64b5a6733ed7f8140fc2817f8e"} Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.438144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzwzn" event={"ID":"6a02c53a-d462-4339-9bf9-3dc9fbc48c71","Type":"ContainerStarted","Data":"bd497434b96944879415b5668c385e9fb6f22157e970f46c895a65fbf78c82a5"} Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.441736 4749 generic.go:334] "Generic (PLEG): container finished" podID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerID="8f9240ea52ada703203bc660407dc20a5919a12d659ef87b276a1946a7d6ca3a" exitCode=0 Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.441784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv24g" event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerDied","Data":"8f9240ea52ada703203bc660407dc20a5919a12d659ef87b276a1946a7d6ca3a"} Feb 25 07:24:52 crc kubenswrapper[4749]: I0225 07:24:52.441817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hv24g" event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerStarted","Data":"70f63382c5f0899be190de8f3d6a4d82b69b3714aa716f69b6de06da5fba1364"} Feb 25 07:24:53 crc kubenswrapper[4749]: I0225 07:24:53.449868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzwzn" event={"ID":"6a02c53a-d462-4339-9bf9-3dc9fbc48c71","Type":"ContainerStarted","Data":"697865eab04457296bf53df6020e61243e7c79b3577f773eae79cd56c90efc11"} Feb 25 07:24:53 crc kubenswrapper[4749]: I0225 07:24:53.451577 4749 generic.go:334] "Generic (PLEG): container finished" podID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerID="256e589630cd3e1825cc54da3104b3bacd1c117b48e9cb758eb053722037d550" exitCode=0 Feb 25 07:24:53 crc kubenswrapper[4749]: I0225 07:24:53.451639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv24g" event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerDied","Data":"256e589630cd3e1825cc54da3104b3bacd1c117b48e9cb758eb053722037d550"} Feb 25 07:24:54 crc kubenswrapper[4749]: I0225 07:24:54.460002 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a02c53a-d462-4339-9bf9-3dc9fbc48c71" containerID="697865eab04457296bf53df6020e61243e7c79b3577f773eae79cd56c90efc11" exitCode=0 Feb 25 07:24:54 crc kubenswrapper[4749]: I0225 07:24:54.460084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzwzn" event={"ID":"6a02c53a-d462-4339-9bf9-3dc9fbc48c71","Type":"ContainerDied","Data":"697865eab04457296bf53df6020e61243e7c79b3577f773eae79cd56c90efc11"} Feb 25 07:24:54 crc kubenswrapper[4749]: I0225 07:24:54.465666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv24g" 
event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerStarted","Data":"d9ef2253dccea103d56e10bf7cd7a8e4e780273fbcc4482cbdf2c737ac032c0d"} Feb 25 07:24:54 crc kubenswrapper[4749]: I0225 07:24:54.546406 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv24g" podStartSLOduration=2.938692429 podStartE2EDuration="4.546387887s" podCreationTimestamp="2026-02-25 07:24:50 +0000 UTC" firstStartedPulling="2026-02-25 07:24:52.443246924 +0000 UTC m=+445.805072944" lastFinishedPulling="2026-02-25 07:24:54.050942382 +0000 UTC m=+447.412768402" observedRunningTime="2026-02-25 07:24:54.5444289 +0000 UTC m=+447.906254920" watchObservedRunningTime="2026-02-25 07:24:54.546387887 +0000 UTC m=+447.908213907" Feb 25 07:24:55 crc kubenswrapper[4749]: I0225 07:24:55.475730 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zzwzn" event={"ID":"6a02c53a-d462-4339-9bf9-3dc9fbc48c71","Type":"ContainerStarted","Data":"a3cddbc79a18c7c74b3a8b6a4cfe9f70b4513c81fd1c137506ca66a1e115f743"} Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.646325 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.647034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.703794 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.722932 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zzwzn" podStartSLOduration=6.251993888 podStartE2EDuration="8.722909569s" podCreationTimestamp="2026-02-25 07:24:50 +0000 UTC" 
firstStartedPulling="2026-02-25 07:24:52.440843866 +0000 UTC m=+445.802669886" lastFinishedPulling="2026-02-25 07:24:54.911759547 +0000 UTC m=+448.273585567" observedRunningTime="2026-02-25 07:24:55.491091504 +0000 UTC m=+448.852917544" watchObservedRunningTime="2026-02-25 07:24:58.722909569 +0000 UTC m=+452.084735629" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.852201 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.853425 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:58 crc kubenswrapper[4749]: I0225 07:24:58.907075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:59 crc kubenswrapper[4749]: I0225 07:24:59.543429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8bqdr" Feb 25 07:24:59 crc kubenswrapper[4749]: I0225 07:24:59.556893 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5q9mx" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.023066 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.023389 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.243692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.243747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.285009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:25:01 crc kubenswrapper[4749]: I0225 07:25:01.545465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:25:02 crc kubenswrapper[4749]: I0225 07:25:02.076932 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zzwzn" podUID="6a02c53a-d462-4339-9bf9-3dc9fbc48c71" containerName="registry-server" probeResult="failure" output=< Feb 25 07:25:02 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:25:02 crc kubenswrapper[4749]: > Feb 25 07:25:11 crc kubenswrapper[4749]: I0225 07:25:11.081726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:25:11 crc kubenswrapper[4749]: I0225 07:25:11.155827 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zzwzn" Feb 25 07:25:11 crc kubenswrapper[4749]: I0225 07:25:11.853397 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" podUID="159a3e78-678a-495d-9621-d523b52df718" containerName="registry" containerID="cri-o://e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2" gracePeriod=30 Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.304036 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.342968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343034 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5vmh\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343318 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.343489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls\") pod \"159a3e78-678a-495d-9621-d523b52df718\" (UID: \"159a3e78-678a-495d-9621-d523b52df718\") " Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.344820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.345913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.353541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh" (OuterVolumeSpecName: "kube-api-access-z5vmh") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "kube-api-access-z5vmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.357935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.364252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.366061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.378429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.382504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "159a3e78-678a-495d-9621-d523b52df718" (UID: "159a3e78-678a-495d-9621-d523b52df718"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.444950 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5vmh\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-kube-api-access-z5vmh\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445014 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445035 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/159a3e78-678a-495d-9621-d523b52df718-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445053 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/159a3e78-678a-495d-9621-d523b52df718-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445071 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445089 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/159a3e78-678a-495d-9621-d523b52df718-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.445107 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/159a3e78-678a-495d-9621-d523b52df718-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.568026 4749 generic.go:334] "Generic (PLEG): container finished" podID="159a3e78-678a-495d-9621-d523b52df718" containerID="e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2" exitCode=0 Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.568103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" event={"ID":"159a3e78-678a-495d-9621-d523b52df718","Type":"ContainerDied","Data":"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2"} Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.568180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" event={"ID":"159a3e78-678a-495d-9621-d523b52df718","Type":"ContainerDied","Data":"e2136c8395d4c11604d9e6eb0f659c0f684532130f2f33fa4f758deec41d78c9"} Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.568210 4749 scope.go:117] "RemoveContainer" 
containerID="e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.568137 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jkzg5" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.593524 4749 scope.go:117] "RemoveContainer" containerID="e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2" Feb 25 07:25:12 crc kubenswrapper[4749]: E0225 07:25:12.595067 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2\": container with ID starting with e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2 not found: ID does not exist" containerID="e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.595143 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2"} err="failed to get container status \"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2\": rpc error: code = NotFound desc = could not find container \"e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2\": container with ID starting with e2a5cede56ae28e495429760e0ad7a57d0e7efad70dfa2a854c60b1b9ce8f3f2 not found: ID does not exist" Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.628053 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:25:12 crc kubenswrapper[4749]: I0225 07:25:12.635034 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jkzg5"] Feb 25 07:25:13 crc kubenswrapper[4749]: I0225 07:25:13.334077 4749 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="159a3e78-678a-495d-9621-d523b52df718" path="/var/lib/kubelet/pods/159a3e78-678a-495d-9621-d523b52df718/volumes" Feb 25 07:25:21 crc kubenswrapper[4749]: I0225 07:25:21.672524 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:25:21 crc kubenswrapper[4749]: I0225 07:25:21.672981 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.671857 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.672626 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.672758 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.673830 4749 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.673932 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905" gracePeriod=600 Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.862647 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905" exitCode=0 Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.862702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905"} Feb 25 07:25:51 crc kubenswrapper[4749]: I0225 07:25:51.862744 4749 scope.go:117] "RemoveContainer" containerID="88cc594d1a982c60febf81846225f8c1ca3b175b9dfdf77669d15bc63730db71" Feb 25 07:25:52 crc kubenswrapper[4749]: I0225 07:25:52.874440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04"} Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.149851 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533406-mbmgt"] Feb 25 07:26:00 crc 
kubenswrapper[4749]: E0225 07:26:00.150916 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159a3e78-678a-495d-9621-d523b52df718" containerName="registry" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.150940 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="159a3e78-678a-495d-9621-d523b52df718" containerName="registry" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.151097 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="159a3e78-678a-495d-9621-d523b52df718" containerName="registry" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.151879 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.154677 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.154919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.155068 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.161412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533406-mbmgt"] Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.161842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75kqx\" (UniqueName: \"kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx\") pod \"auto-csr-approver-29533406-mbmgt\" (UID: \"6a286dbe-25c1-4359-98d0-f79d66828da1\") " pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.263524 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-75kqx\" (UniqueName: \"kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx\") pod \"auto-csr-approver-29533406-mbmgt\" (UID: \"6a286dbe-25c1-4359-98d0-f79d66828da1\") " pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.296661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75kqx\" (UniqueName: \"kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx\") pod \"auto-csr-approver-29533406-mbmgt\" (UID: \"6a286dbe-25c1-4359-98d0-f79d66828da1\") " pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.493395 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.734338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533406-mbmgt"] Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.745523 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:26:00 crc kubenswrapper[4749]: I0225 07:26:00.953447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" event={"ID":"6a286dbe-25c1-4359-98d0-f79d66828da1","Type":"ContainerStarted","Data":"be818be266d7f403be3fe30634a7ee763510350b47233d57017594a1cb627bc1"} Feb 25 07:26:02 crc kubenswrapper[4749]: I0225 07:26:02.977797 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a286dbe-25c1-4359-98d0-f79d66828da1" containerID="76b5984e3533edb5fec5a64059ed8fcbdb292796c2c8ee1eaebe159be241260d" exitCode=0 Feb 25 07:26:02 crc kubenswrapper[4749]: I0225 07:26:02.977920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29533406-mbmgt" event={"ID":"6a286dbe-25c1-4359-98d0-f79d66828da1","Type":"ContainerDied","Data":"76b5984e3533edb5fec5a64059ed8fcbdb292796c2c8ee1eaebe159be241260d"} Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.304830 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.333346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75kqx\" (UniqueName: \"kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx\") pod \"6a286dbe-25c1-4359-98d0-f79d66828da1\" (UID: \"6a286dbe-25c1-4359-98d0-f79d66828da1\") " Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.341962 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx" (OuterVolumeSpecName: "kube-api-access-75kqx") pod "6a286dbe-25c1-4359-98d0-f79d66828da1" (UID: "6a286dbe-25c1-4359-98d0-f79d66828da1"). InnerVolumeSpecName "kube-api-access-75kqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.435146 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75kqx\" (UniqueName: \"kubernetes.io/projected/6a286dbe-25c1-4359-98d0-f79d66828da1-kube-api-access-75kqx\") on node \"crc\" DevicePath \"\"" Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.994129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" event={"ID":"6a286dbe-25c1-4359-98d0-f79d66828da1","Type":"ContainerDied","Data":"be818be266d7f403be3fe30634a7ee763510350b47233d57017594a1cb627bc1"} Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.994192 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be818be266d7f403be3fe30634a7ee763510350b47233d57017594a1cb627bc1" Feb 25 07:26:04 crc kubenswrapper[4749]: I0225 07:26:04.994243 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533406-mbmgt" Feb 25 07:26:05 crc kubenswrapper[4749]: I0225 07:26:05.397488 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533400-6w2xx"] Feb 25 07:26:05 crc kubenswrapper[4749]: I0225 07:26:05.403550 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533400-6w2xx"] Feb 25 07:26:07 crc kubenswrapper[4749]: I0225 07:26:07.334909 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8700de-dc40-4245-8c99-e792c342b5bb" path="/var/lib/kubelet/pods/2a8700de-dc40-4245-8c99-e792c342b5bb/volumes" Feb 25 07:27:34 crc kubenswrapper[4749]: I0225 07:27:34.176741 4749 scope.go:117] "RemoveContainer" containerID="0e07959c96b3a4a206eef86c40104d0a734d892a43f789b98f202d0c9fd8bbd8" Feb 25 07:27:51 crc kubenswrapper[4749]: I0225 07:27:51.672727 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:27:51 crc kubenswrapper[4749]: I0225 07:27:51.673254 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.149478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533408-bbf9b"] Feb 25 07:28:00 crc kubenswrapper[4749]: E0225 07:28:00.150558 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a286dbe-25c1-4359-98d0-f79d66828da1" containerName="oc" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.150584 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a286dbe-25c1-4359-98d0-f79d66828da1" containerName="oc" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.150777 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a286dbe-25c1-4359-98d0-f79d66828da1" containerName="oc" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.151392 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.156478 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.157285 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.157565 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.158042 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533408-bbf9b"] Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.163825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qsm\" (UniqueName: \"kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm\") pod \"auto-csr-approver-29533408-bbf9b\" (UID: \"268d7f08-512d-45a3-b577-5f865420f45e\") " pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.264924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qsm\" (UniqueName: \"kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm\") pod \"auto-csr-approver-29533408-bbf9b\" (UID: \"268d7f08-512d-45a3-b577-5f865420f45e\") " pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.301005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qsm\" (UniqueName: \"kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm\") pod \"auto-csr-approver-29533408-bbf9b\" (UID: \"268d7f08-512d-45a3-b577-5f865420f45e\") " 
pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.476190 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.765816 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533408-bbf9b"] Feb 25 07:28:00 crc kubenswrapper[4749]: I0225 07:28:00.853375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" event={"ID":"268d7f08-512d-45a3-b577-5f865420f45e","Type":"ContainerStarted","Data":"e964f18d40146991a732ae843c3f39916d3663196f61e4f6734f0117ccfceea0"} Feb 25 07:28:02 crc kubenswrapper[4749]: I0225 07:28:02.869238 4749 generic.go:334] "Generic (PLEG): container finished" podID="268d7f08-512d-45a3-b577-5f865420f45e" containerID="1656393c0ad300dce187b340e77b5ac6c9f979b63d1cbb70310968d5e5a2cad4" exitCode=0 Feb 25 07:28:02 crc kubenswrapper[4749]: I0225 07:28:02.869671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" event={"ID":"268d7f08-512d-45a3-b577-5f865420f45e","Type":"ContainerDied","Data":"1656393c0ad300dce187b340e77b5ac6c9f979b63d1cbb70310968d5e5a2cad4"} Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.130293 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.324455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4qsm\" (UniqueName: \"kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm\") pod \"268d7f08-512d-45a3-b577-5f865420f45e\" (UID: \"268d7f08-512d-45a3-b577-5f865420f45e\") " Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.332830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm" (OuterVolumeSpecName: "kube-api-access-f4qsm") pod "268d7f08-512d-45a3-b577-5f865420f45e" (UID: "268d7f08-512d-45a3-b577-5f865420f45e"). InnerVolumeSpecName "kube-api-access-f4qsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.426142 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4qsm\" (UniqueName: \"kubernetes.io/projected/268d7f08-512d-45a3-b577-5f865420f45e-kube-api-access-f4qsm\") on node \"crc\" DevicePath \"\"" Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.896184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" event={"ID":"268d7f08-512d-45a3-b577-5f865420f45e","Type":"ContainerDied","Data":"e964f18d40146991a732ae843c3f39916d3663196f61e4f6734f0117ccfceea0"} Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.896245 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e964f18d40146991a732ae843c3f39916d3663196f61e4f6734f0117ccfceea0" Feb 25 07:28:04 crc kubenswrapper[4749]: I0225 07:28:04.896282 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533408-bbf9b" Feb 25 07:28:05 crc kubenswrapper[4749]: I0225 07:28:05.206522 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533402-x8rfc"] Feb 25 07:28:05 crc kubenswrapper[4749]: I0225 07:28:05.210683 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533402-x8rfc"] Feb 25 07:28:05 crc kubenswrapper[4749]: I0225 07:28:05.330167 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bbf72f-7a88-435b-923c-9b56dfd488c5" path="/var/lib/kubelet/pods/58bbf72f-7a88-435b-923c-9b56dfd488c5/volumes" Feb 25 07:28:21 crc kubenswrapper[4749]: I0225 07:28:21.671413 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:28:21 crc kubenswrapper[4749]: I0225 07:28:21.672859 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:28:34 crc kubenswrapper[4749]: I0225 07:28:34.217801 4749 scope.go:117] "RemoveContainer" containerID="bdb579912f9163bb51d8cd8835dc5eef67975260bf9a8fa3dba0fa89d9c819c5" Feb 25 07:28:34 crc kubenswrapper[4749]: I0225 07:28:34.269267 4749 scope.go:117] "RemoveContainer" containerID="bdc368f7951f633f845f7d9178f5b3e679324ca0ab8a41fe8e6d7de4d8af80f0" Feb 25 07:28:34 crc kubenswrapper[4749]: I0225 07:28:34.295844 4749 scope.go:117] "RemoveContainer" containerID="d91e372cb404b8a650775dd575faf749d114c6f3349d530171b962c23118b0dc" Feb 25 07:28:34 crc 
kubenswrapper[4749]: I0225 07:28:34.344438 4749 scope.go:117] "RemoveContainer" containerID="a72ce161885e729b31c394e17be6dcc914b30a04ed1fc31736ad35716fe23ed2" Feb 25 07:28:51 crc kubenswrapper[4749]: I0225 07:28:51.671353 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:28:51 crc kubenswrapper[4749]: I0225 07:28:51.671975 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:28:51 crc kubenswrapper[4749]: I0225 07:28:51.672031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:28:51 crc kubenswrapper[4749]: I0225 07:28:51.672681 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:28:51 crc kubenswrapper[4749]: I0225 07:28:51.672741 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04" gracePeriod=600 Feb 25 07:28:52 crc kubenswrapper[4749]: I0225 
07:28:52.230348 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04" exitCode=0 Feb 25 07:28:52 crc kubenswrapper[4749]: I0225 07:28:52.230438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04"} Feb 25 07:28:52 crc kubenswrapper[4749]: I0225 07:28:52.231102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13"} Feb 25 07:28:52 crc kubenswrapper[4749]: I0225 07:28:52.231147 4749 scope.go:117] "RemoveContainer" containerID="407c1941a019a95f57992808eaf8683f696568a6bc9c34a66b12c2456ec0f905" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.490850 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rtblf"] Feb 25 07:29:33 crc kubenswrapper[4749]: E0225 07:29:33.493952 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268d7f08-512d-45a3-b577-5f865420f45e" containerName="oc" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.493990 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="268d7f08-512d-45a3-b577-5f865420f45e" containerName="oc" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.494076 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="268d7f08-512d-45a3-b577-5f865420f45e" containerName="oc" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.494462 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.497243 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5x9w8" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.497267 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.497559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.503196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rtblf"] Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.512425 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-6858f"] Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.513156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6858f" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.513771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbl94\" (UniqueName: \"kubernetes.io/projected/e03b662d-9431-4217-b126-b2a7db9ab5e4-kube-api-access-hbl94\") pod \"cert-manager-cainjector-cf98fcc89-rtblf\" (UID: \"e03b662d-9431-4217-b126-b2a7db9ab5e4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.516572 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-kwwnd" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.526154 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6858f"] Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.531176 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h5fwj"] Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.531866 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.535343 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zs2m9" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.546338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h5fwj"] Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.615518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65scw\" (UniqueName: \"kubernetes.io/projected/acd8dc37-5680-4838-aa0e-bf79c4209283-kube-api-access-65scw\") pod \"cert-manager-webhook-687f57d79b-h5fwj\" (UID: \"acd8dc37-5680-4838-aa0e-bf79c4209283\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.615565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbl94\" (UniqueName: \"kubernetes.io/projected/e03b662d-9431-4217-b126-b2a7db9ab5e4-kube-api-access-hbl94\") pod \"cert-manager-cainjector-cf98fcc89-rtblf\" (UID: \"e03b662d-9431-4217-b126-b2a7db9ab5e4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.615635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jrq\" (UniqueName: \"kubernetes.io/projected/280567a7-b82f-4767-93b7-725ad0ff927e-kube-api-access-r4jrq\") pod \"cert-manager-858654f9db-6858f\" (UID: \"280567a7-b82f-4767-93b7-725ad0ff927e\") " pod="cert-manager/cert-manager-858654f9db-6858f" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.634111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbl94\" (UniqueName: 
\"kubernetes.io/projected/e03b662d-9431-4217-b126-b2a7db9ab5e4-kube-api-access-hbl94\") pod \"cert-manager-cainjector-cf98fcc89-rtblf\" (UID: \"e03b662d-9431-4217-b126-b2a7db9ab5e4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.717105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65scw\" (UniqueName: \"kubernetes.io/projected/acd8dc37-5680-4838-aa0e-bf79c4209283-kube-api-access-65scw\") pod \"cert-manager-webhook-687f57d79b-h5fwj\" (UID: \"acd8dc37-5680-4838-aa0e-bf79c4209283\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.717482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jrq\" (UniqueName: \"kubernetes.io/projected/280567a7-b82f-4767-93b7-725ad0ff927e-kube-api-access-r4jrq\") pod \"cert-manager-858654f9db-6858f\" (UID: \"280567a7-b82f-4767-93b7-725ad0ff927e\") " pod="cert-manager/cert-manager-858654f9db-6858f" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.733289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65scw\" (UniqueName: \"kubernetes.io/projected/acd8dc37-5680-4838-aa0e-bf79c4209283-kube-api-access-65scw\") pod \"cert-manager-webhook-687f57d79b-h5fwj\" (UID: \"acd8dc37-5680-4838-aa0e-bf79c4209283\") " pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.741455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jrq\" (UniqueName: \"kubernetes.io/projected/280567a7-b82f-4767-93b7-725ad0ff927e-kube-api-access-r4jrq\") pod \"cert-manager-858654f9db-6858f\" (UID: \"280567a7-b82f-4767-93b7-725ad0ff927e\") " pod="cert-manager/cert-manager-858654f9db-6858f" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.820255 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.840150 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-6858f" Feb 25 07:29:33 crc kubenswrapper[4749]: I0225 07:29:33.859471 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.279840 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-6858f"] Feb 25 07:29:34 crc kubenswrapper[4749]: W0225 07:29:34.390463 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03b662d_9431_4217_b126_b2a7db9ab5e4.slice/crio-efd1611cc76e5404a7baa5f1ad9ed20e9b4afe66b7dee228e00b84db6471d499 WatchSource:0}: Error finding container efd1611cc76e5404a7baa5f1ad9ed20e9b4afe66b7dee228e00b84db6471d499: Status 404 returned error can't find the container with id efd1611cc76e5404a7baa5f1ad9ed20e9b4afe66b7dee228e00b84db6471d499 Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.390523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-rtblf"] Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.395005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-h5fwj"] Feb 25 07:29:34 crc kubenswrapper[4749]: W0225 07:29:34.399846 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacd8dc37_5680_4838_aa0e_bf79c4209283.slice/crio-a4176f00854120a11354fb4fd21af66e65072bd33ece217b440c36ddfb8c6549 WatchSource:0}: Error finding container a4176f00854120a11354fb4fd21af66e65072bd33ece217b440c36ddfb8c6549: Status 404 returned error can't find the 
container with id a4176f00854120a11354fb4fd21af66e65072bd33ece217b440c36ddfb8c6549 Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.504443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6858f" event={"ID":"280567a7-b82f-4767-93b7-725ad0ff927e","Type":"ContainerStarted","Data":"b55ae4d7adf27274a9cab1d3ec14f3854c422a99acf2416af60d8a523060befd"} Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.505509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" event={"ID":"e03b662d-9431-4217-b126-b2a7db9ab5e4","Type":"ContainerStarted","Data":"efd1611cc76e5404a7baa5f1ad9ed20e9b4afe66b7dee228e00b84db6471d499"} Feb 25 07:29:34 crc kubenswrapper[4749]: I0225 07:29:34.506385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" event={"ID":"acd8dc37-5680-4838-aa0e-bf79c4209283","Type":"ContainerStarted","Data":"a4176f00854120a11354fb4fd21af66e65072bd33ece217b440c36ddfb8c6549"} Feb 25 07:29:38 crc kubenswrapper[4749]: I0225 07:29:38.529128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-6858f" event={"ID":"280567a7-b82f-4767-93b7-725ad0ff927e","Type":"ContainerStarted","Data":"02ee3540e947ee3fc6588fd1d4f104b3ff7027686769f67d4a9ce5797094e6d0"} Feb 25 07:29:38 crc kubenswrapper[4749]: I0225 07:29:38.532124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" event={"ID":"e03b662d-9431-4217-b126-b2a7db9ab5e4","Type":"ContainerStarted","Data":"0e82e51b73010ca10903551cb3240f5a23d8428b5dfdfe079335abdc9253ec01"} Feb 25 07:29:38 crc kubenswrapper[4749]: I0225 07:29:38.547623 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-6858f" podStartSLOduration=1.8050365560000001 podStartE2EDuration="5.547602605s" podCreationTimestamp="2026-02-25 
07:29:33 +0000 UTC" firstStartedPulling="2026-02-25 07:29:34.289805962 +0000 UTC m=+727.651631992" lastFinishedPulling="2026-02-25 07:29:38.032372001 +0000 UTC m=+731.394198041" observedRunningTime="2026-02-25 07:29:38.543691381 +0000 UTC m=+731.905517401" watchObservedRunningTime="2026-02-25 07:29:38.547602605 +0000 UTC m=+731.909428625" Feb 25 07:29:38 crc kubenswrapper[4749]: I0225 07:29:38.559608 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-rtblf" podStartSLOduration=1.92213709 podStartE2EDuration="5.559563375s" podCreationTimestamp="2026-02-25 07:29:33 +0000 UTC" firstStartedPulling="2026-02-25 07:29:34.392758503 +0000 UTC m=+727.754584523" lastFinishedPulling="2026-02-25 07:29:38.030184778 +0000 UTC m=+731.392010808" observedRunningTime="2026-02-25 07:29:38.558847697 +0000 UTC m=+731.920673707" watchObservedRunningTime="2026-02-25 07:29:38.559563375 +0000 UTC m=+731.921389395" Feb 25 07:29:39 crc kubenswrapper[4749]: I0225 07:29:39.542458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" event={"ID":"acd8dc37-5680-4838-aa0e-bf79c4209283","Type":"ContainerStarted","Data":"6b27d99a908d385532f53a1dc0d3b46d64f1e7db3053a95001791378d6bcc333"} Feb 25 07:29:39 crc kubenswrapper[4749]: I0225 07:29:39.567425 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" podStartSLOduration=2.063345554 podStartE2EDuration="6.567377775s" podCreationTimestamp="2026-02-25 07:29:33 +0000 UTC" firstStartedPulling="2026-02-25 07:29:34.40218162 +0000 UTC m=+727.764007660" lastFinishedPulling="2026-02-25 07:29:38.906213861 +0000 UTC m=+732.268039881" observedRunningTime="2026-02-25 07:29:39.560744375 +0000 UTC m=+732.922570405" watchObservedRunningTime="2026-02-25 07:29:39.567377775 +0000 UTC m=+732.929203835" Feb 25 07:29:40 crc kubenswrapper[4749]: I0225 07:29:40.550884 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.586295 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r9pzm"] Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.587805 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-controller" containerID="cri-o://32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588227 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588301 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-acl-logging" containerID="cri-o://a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588317 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="sbdb" containerID="cri-o://3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588412 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" 
podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="nbdb" containerID="cri-o://9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588413 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-node" containerID="cri-o://e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.588241 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="northd" containerID="cri-o://e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.659330 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" containerID="cri-o://ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" gracePeriod=30 Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.977310 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/3.log" Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.979669 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovn-acl-logging/0.log" Feb 25 07:29:43 crc kubenswrapper[4749]: I0225 07:29:43.980146 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovn-controller/0.log" Feb 25 07:29:43 crc 
kubenswrapper[4749]: I0225 07:29:43.980839 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.041661 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zgv58"] Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.041991 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042006 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042016 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-acl-logging" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042023 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-acl-logging" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042030 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="nbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042035 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="nbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042044 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042050 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042057 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042063 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042070 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042075 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042084 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="northd" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042090 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="northd" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042098 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kubecfg-setup" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042104 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kubecfg-setup" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042110 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042123 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="sbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042128 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="sbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042135 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042142 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042153 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-node" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042159 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-node" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.042166 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042172 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042269 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042277 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042284 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="nbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042293 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-acl-logging" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042299 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="northd" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042307 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042318 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-node" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042327 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="kube-rbac-proxy-ovn-metrics" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042335 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="sbdb" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042344 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovn-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042352 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.042546 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerName="ovnkube-controller" Feb 25 07:29:44 crc kubenswrapper[4749]: 
I0225 07:29:44.044046 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145500 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.145714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146118 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146177 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146217 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket" (OuterVolumeSpecName: "log-socket") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146323 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klvlz\" (UniqueName: \"kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log" (OuterVolumeSpecName: "node-log") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash\") pod \"fae19e32-92e3-446f-9a38-85e8fef239dd\" (UID: \"fae19e32-92e3-446f-9a38-85e8fef239dd\") " Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146517 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash" (OuterVolumeSpecName: "host-slash") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-kubelet\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-systemd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-env-overrides\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-slash\") pod \"ovnkube-node-zgv58\" (UID: 
\"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64134e92-9e00-484d-a677-85cd6e167544-ovn-node-metrics-cert\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-etc-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-var-lib-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-script-lib\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.146991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-node-log\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-netd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdd8n\" (UniqueName: \"kubernetes.io/projected/64134e92-9e00-484d-a677-85cd6e167544-kube-api-access-hdd8n\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-config\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-ovn\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-systemd-units\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-bin\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-netns\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147191 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-log-socket\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147264 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147276 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-slash\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147287 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147296 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147305 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147314 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147322 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147330 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-log-socket\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147338 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147333 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147347 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147392 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-node-log\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147408 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147423 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147437 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147452 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.147469 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.153400 4749 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.153465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz" (OuterVolumeSpecName: "kube-api-access-klvlz") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "kube-api-access-klvlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.163114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fae19e32-92e3-446f-9a38-85e8fef239dd" (UID: "fae19e32-92e3-446f-9a38-85e8fef239dd"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.247887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-bin\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.247943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-netns\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.247966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.247996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-log-socket\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-kubelet\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-systemd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-log-socket\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-env-overrides\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 
07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-systemd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-run-netns\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248227 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-kubelet\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248148 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-slash\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248019 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-bin\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/64134e92-9e00-484d-a677-85cd6e167544-ovn-node-metrics-cert\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-etc-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-var-lib-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-script-lib\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-netd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-var-lib-openvswitch\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248628 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-node-log\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-etc-openvswitch\") pod 
\"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdd8n\" (UniqueName: \"kubernetes.io/projected/64134e92-9e00-484d-a677-85cd6e167544-kube-api-access-hdd8n\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-env-overrides\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-config\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-cni-netd\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-node-log\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-ovn\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.248908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-run-ovn\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-systemd-units\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-systemd-units\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249189 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klvlz\" (UniqueName: \"kubernetes.io/projected/fae19e32-92e3-446f-9a38-85e8fef239dd-kube-api-access-klvlz\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-script-lib\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249221 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae19e32-92e3-446f-9a38-85e8fef239dd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249264 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae19e32-92e3-446f-9a38-85e8fef239dd-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249278 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae19e32-92e3-446f-9a38-85e8fef239dd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/64134e92-9e00-484d-a677-85cd6e167544-ovnkube-config\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.249662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64134e92-9e00-484d-a677-85cd6e167544-host-slash\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.253144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/64134e92-9e00-484d-a677-85cd6e167544-ovn-node-metrics-cert\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.280157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdd8n\" (UniqueName: \"kubernetes.io/projected/64134e92-9e00-484d-a677-85cd6e167544-kube-api-access-hdd8n\") pod \"ovnkube-node-zgv58\" (UID: \"64134e92-9e00-484d-a677-85cd6e167544\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.364507 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.577760 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/2.log" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.578479 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/1.log" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.578522 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c23d4e-91a8-4374-84dc-7bdc7450661d" containerID="87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88" exitCode=2 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.578584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerDied","Data":"87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.578638 4749 scope.go:117] "RemoveContainer" containerID="f0d74dbf53b0a574c0f75285b2ec956d3d5aadba66620b710c924de3408208b5" Feb 25 07:29:44 crc kubenswrapper[4749]: 
I0225 07:29:44.579089 4749 scope.go:117] "RemoveContainer" containerID="87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.579274 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bkmjf_openshift-multus(21c23d4e-91a8-4374-84dc-7bdc7450661d)\"" pod="openshift-multus/multus-bkmjf" podUID="21c23d4e-91a8-4374-84dc-7bdc7450661d" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.583752 4749 generic.go:334] "Generic (PLEG): container finished" podID="64134e92-9e00-484d-a677-85cd6e167544" containerID="4ebff9086f627d99b5405e08b6255c9b2dc592f682dc08764a5f5f92c65b9bd3" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.583844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerDied","Data":"4ebff9086f627d99b5405e08b6255c9b2dc592f682dc08764a5f5f92c65b9bd3"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.583872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"1cf311526289739169fd4b0d810a1900d78bc0d116366223c90ad963b9f17a2d"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.586966 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovnkube-controller/3.log" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.591743 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovn-acl-logging/0.log" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592221 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r9pzm_fae19e32-92e3-446f-9a38-85e8fef239dd/ovn-controller/0.log" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592529 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592551 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592560 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592569 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592576 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592583 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" exitCode=0 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592631 4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" exitCode=143 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592642 
4749 generic.go:334] "Generic (PLEG): container finished" podID="fae19e32-92e3-446f-9a38-85e8fef239dd" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" exitCode=143 Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592699 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592711 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" 
event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592742 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592751 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592756 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592762 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592768 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592774 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592782 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592788 4749 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592794 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592801 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592803 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592922 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592931 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592940 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592946 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592952 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592958 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592964 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592970 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592977 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592983 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.592992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:29:44 crc 
kubenswrapper[4749]: I0225 07:29:44.593003 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593011 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593018 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593024 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593031 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593037 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593045 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593051 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:29:44 crc 
kubenswrapper[4749]: I0225 07:29:44.593057 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593063 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r9pzm" event={"ID":"fae19e32-92e3-446f-9a38-85e8fef239dd","Type":"ContainerDied","Data":"26dd9b1ef52806d951dea6a445f2af47160b2cb5cca7faf401a2e09c36176d44"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593079 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593088 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593095 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593101 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593108 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593114 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593121 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593127 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593133 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.593139 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.608633 4749 scope.go:117] "RemoveContainer" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.622917 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.659727 4749 scope.go:117] "RemoveContainer" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.660779 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-r9pzm"] Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.664153 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-r9pzm"] Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.678234 4749 scope.go:117] "RemoveContainer" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.695496 4749 scope.go:117] "RemoveContainer" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.710122 4749 scope.go:117] "RemoveContainer" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.786684 4749 scope.go:117] "RemoveContainer" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.803961 4749 scope.go:117] "RemoveContainer" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.825384 4749 scope.go:117] "RemoveContainer" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.839457 4749 scope.go:117] "RemoveContainer" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.856484 4749 scope.go:117] "RemoveContainer" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.856909 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": container with ID starting with 
ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04 not found: ID does not exist" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.857021 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} err="failed to get container status \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": rpc error: code = NotFound desc = could not find container \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": container with ID starting with ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.857056 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.857420 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": container with ID starting with cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95 not found: ID does not exist" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.857445 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} err="failed to get container status \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": rpc error: code = NotFound desc = could not find container \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": container with ID starting with cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95 not found: ID does not 
exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.857459 4749 scope.go:117] "RemoveContainer" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.858000 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": container with ID starting with 3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7 not found: ID does not exist" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.858043 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} err="failed to get container status \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": rpc error: code = NotFound desc = could not find container \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": container with ID starting with 3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.858072 4749 scope.go:117] "RemoveContainer" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.858732 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": container with ID starting with 9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358 not found: ID does not exist" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.858759 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} err="failed to get container status \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": rpc error: code = NotFound desc = could not find container \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": container with ID starting with 9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.858777 4749 scope.go:117] "RemoveContainer" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.859180 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": container with ID starting with e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2 not found: ID does not exist" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.859217 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} err="failed to get container status \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": rpc error: code = NotFound desc = could not find container \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": container with ID starting with e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.859245 4749 scope.go:117] "RemoveContainer" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.859653 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": container with ID starting with 39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593 not found: ID does not exist" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.859681 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} err="failed to get container status \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": rpc error: code = NotFound desc = could not find container \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": container with ID starting with 39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.859698 4749 scope.go:117] "RemoveContainer" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.860340 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": container with ID starting with e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10 not found: ID does not exist" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.860368 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} err="failed to get container status \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": rpc error: code = NotFound desc = could 
not find container \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": container with ID starting with e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.860384 4749 scope.go:117] "RemoveContainer" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.860856 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": container with ID starting with a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a not found: ID does not exist" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.860884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} err="failed to get container status \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": rpc error: code = NotFound desc = could not find container \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": container with ID starting with a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.860901 4749 scope.go:117] "RemoveContainer" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.861250 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": container with ID starting with 32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a not found: 
ID does not exist" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861286 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} err="failed to get container status \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": rpc error: code = NotFound desc = could not find container \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": container with ID starting with 32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861332 4749 scope.go:117] "RemoveContainer" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: E0225 07:29:44.861632 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": container with ID starting with 220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f not found: ID does not exist" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861665 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} err="failed to get container status \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": rpc error: code = NotFound desc = could not find container \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": container with ID starting with 220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861712 4749 
scope.go:117] "RemoveContainer" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861952 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} err="failed to get container status \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": rpc error: code = NotFound desc = could not find container \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": container with ID starting with ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.861983 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.862233 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} err="failed to get container status \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": rpc error: code = NotFound desc = could not find container \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": container with ID starting with cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.862256 4749 scope.go:117] "RemoveContainer" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.862472 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} err="failed to get container status \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": rpc 
error: code = NotFound desc = could not find container \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": container with ID starting with 3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.862494 4749 scope.go:117] "RemoveContainer" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.864441 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} err="failed to get container status \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": rpc error: code = NotFound desc = could not find container \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": container with ID starting with 9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.864490 4749 scope.go:117] "RemoveContainer" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.864858 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} err="failed to get container status \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": rpc error: code = NotFound desc = could not find container \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": container with ID starting with e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.864884 4749 scope.go:117] "RemoveContainer" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc 
kubenswrapper[4749]: I0225 07:29:44.865122 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} err="failed to get container status \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": rpc error: code = NotFound desc = could not find container \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": container with ID starting with 39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865166 4749 scope.go:117] "RemoveContainer" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865390 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} err="failed to get container status \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": rpc error: code = NotFound desc = could not find container \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": container with ID starting with e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865412 4749 scope.go:117] "RemoveContainer" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865658 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} err="failed to get container status \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": rpc error: code = NotFound desc = could not find container \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": container 
with ID starting with a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865701 4749 scope.go:117] "RemoveContainer" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865883 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} err="failed to get container status \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": rpc error: code = NotFound desc = could not find container \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": container with ID starting with 32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.865903 4749 scope.go:117] "RemoveContainer" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866112 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} err="failed to get container status \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": rpc error: code = NotFound desc = could not find container \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": container with ID starting with 220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866154 4749 scope.go:117] "RemoveContainer" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866373 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} err="failed to get container status \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": rpc error: code = NotFound desc = could not find container \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": container with ID starting with ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866397 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866680 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} err="failed to get container status \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": rpc error: code = NotFound desc = could not find container \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": container with ID starting with cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866704 4749 scope.go:117] "RemoveContainer" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.866972 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} err="failed to get container status \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": rpc error: code = NotFound desc = could not find container \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": container with ID starting with 3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7 not found: ID does not 
exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867013 4749 scope.go:117] "RemoveContainer" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867219 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} err="failed to get container status \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": rpc error: code = NotFound desc = could not find container \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": container with ID starting with 9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867239 4749 scope.go:117] "RemoveContainer" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867447 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} err="failed to get container status \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": rpc error: code = NotFound desc = could not find container \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": container with ID starting with e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867472 4749 scope.go:117] "RemoveContainer" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867718 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} err="failed to get container status 
\"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": rpc error: code = NotFound desc = could not find container \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": container with ID starting with 39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867761 4749 scope.go:117] "RemoveContainer" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.867990 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} err="failed to get container status \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": rpc error: code = NotFound desc = could not find container \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": container with ID starting with e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868012 4749 scope.go:117] "RemoveContainer" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868204 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} err="failed to get container status \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": rpc error: code = NotFound desc = could not find container \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": container with ID starting with a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868223 4749 scope.go:117] "RemoveContainer" 
containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868401 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} err="failed to get container status \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": rpc error: code = NotFound desc = could not find container \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": container with ID starting with 32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868445 4749 scope.go:117] "RemoveContainer" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868668 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} err="failed to get container status \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": rpc error: code = NotFound desc = could not find container \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": container with ID starting with 220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868688 4749 scope.go:117] "RemoveContainer" containerID="ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868879 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04"} err="failed to get container status \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": rpc error: code = NotFound desc = could 
not find container \"ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04\": container with ID starting with ff8786e527d30353841802a57e236b0aeb4805a318924714f9c6e19445f59b04 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.868921 4749 scope.go:117] "RemoveContainer" containerID="cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.869096 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95"} err="failed to get container status \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": rpc error: code = NotFound desc = could not find container \"cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95\": container with ID starting with cea067bd8373362f83f58080897cdebe29fd27b371818f7768cbda170af36c95 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.869113 4749 scope.go:117] "RemoveContainer" containerID="3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.870739 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7"} err="failed to get container status \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": rpc error: code = NotFound desc = could not find container \"3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7\": container with ID starting with 3761b1ec6c012585cb8ef458255c34d78a8bf43eb3cfa28196b402ea269560d7 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.870785 4749 scope.go:117] "RemoveContainer" containerID="9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 
07:29:44.871058 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358"} err="failed to get container status \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": rpc error: code = NotFound desc = could not find container \"9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358\": container with ID starting with 9a5839752d05e98645c534cf79d44a76d6e5f42b34b702e5b1ea73dcc9c7c358 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871076 4749 scope.go:117] "RemoveContainer" containerID="e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871297 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2"} err="failed to get container status \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": rpc error: code = NotFound desc = could not find container \"e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2\": container with ID starting with e4e6ffb5eff13bf3a271f9578a9f3be42f02f7f325aaf77b0f0309da01ead1e2 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871341 4749 scope.go:117] "RemoveContainer" containerID="39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871526 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593"} err="failed to get container status \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": rpc error: code = NotFound desc = could not find container \"39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593\": container with ID starting with 
39532cd3ff4b21592485e4c4ab3ea22e7b8aae5f7a9ed995fe95f170c7fdc593 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871544 4749 scope.go:117] "RemoveContainer" containerID="e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871771 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10"} err="failed to get container status \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": rpc error: code = NotFound desc = could not find container \"e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10\": container with ID starting with e9f6744883d397d819fd5aedabc671b9117a5843b31b935ed9ffaefbf7b19b10 not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871812 4749 scope.go:117] "RemoveContainer" containerID="a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871979 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a"} err="failed to get container status \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": rpc error: code = NotFound desc = could not find container \"a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a\": container with ID starting with a7e7945e73a5f18cd8609f7d5f7d7a33d8c814a25997d8e90ad32289b184f70a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.871997 4749 scope.go:117] "RemoveContainer" containerID="32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.872191 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a"} err="failed to get container status \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": rpc error: code = NotFound desc = could not find container \"32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a\": container with ID starting with 32258177b4894bb2252f37ef77d4cb8e091f319b0eb5ac452e7e6480653f980a not found: ID does not exist" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.872232 4749 scope.go:117] "RemoveContainer" containerID="220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f" Feb 25 07:29:44 crc kubenswrapper[4749]: I0225 07:29:44.872469 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f"} err="failed to get container status \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": rpc error: code = NotFound desc = could not find container \"220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f\": container with ID starting with 220fa7132db7c3fdf8e6997f1773819e0a1f88eda3e9dbdc54af01f75c1dd74f not found: ID does not exist" Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.330801 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae19e32-92e3-446f-9a38-85e8fef239dd" path="/var/lib/kubelet/pods/fae19e32-92e3-446f-9a38-85e8fef239dd/volumes" Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"506ab72e1773d2b45bf6df50fea0232a037b80de4f617c7db357756cec8bd0aa"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" 
event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"03095dde35e0499ff6e4a089a4ad7e97b1b40c19e92d478dbd5bbda351c80136"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"3b648d68f4a91bdb67bbd614429fd86b2e3f88b837b7b2fb83356380997c0843"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"f3456ca02200826db4a3b92e2b8445aaaf2d914b5a4cfa79e49e587477ef81ff"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"ea68cde25cfaf170312e8737ab342608d2a7a2a7ec27cb2403ab762202fc4973"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.599197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"5200b6ba47549fea3ab11cb5553f15e4d316759bb10d69df3d1f254a74e1dc10"} Feb 25 07:29:45 crc kubenswrapper[4749]: I0225 07:29:45.601319 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/2.log" Feb 25 07:29:47 crc kubenswrapper[4749]: I0225 07:29:47.621140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"d714027ee017823875a47e0766a7f1c3cab2c1da89156e52cce389139145fd96"} Feb 25 07:29:48 crc 
kubenswrapper[4749]: I0225 07:29:48.865058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-h5fwj" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.647354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" event={"ID":"64134e92-9e00-484d-a677-85cd6e167544","Type":"ContainerStarted","Data":"71860317ff8c34a18f14403e43b23dc22776fdb3f1e4c84a431ec0d5792fe81d"} Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.647835 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.647878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.647890 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.683466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.686808 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" podStartSLOduration=6.686781613 podStartE2EDuration="6.686781613s" podCreationTimestamp="2026-02-25 07:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:29:50.683528804 +0000 UTC m=+744.045354824" watchObservedRunningTime="2026-02-25 07:29:50.686781613 +0000 UTC m=+744.048607683" Feb 25 07:29:50 crc kubenswrapper[4749]: I0225 07:29:50.695292 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 
07:29:57 crc kubenswrapper[4749]: I0225 07:29:57.327144 4749 scope.go:117] "RemoveContainer" containerID="87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88" Feb 25 07:29:57 crc kubenswrapper[4749]: E0225 07:29:57.330113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bkmjf_openshift-multus(21c23d4e-91a8-4374-84dc-7bdc7450661d)\"" pod="openshift-multus/multus-bkmjf" podUID="21c23d4e-91a8-4374-84dc-7bdc7450661d" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.146092 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533410-8sgtm"] Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.147970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.149991 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.151202 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd"] Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.151612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.152023 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.154168 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.154460 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.154467 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.156351 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533410-8sgtm"] Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.161732 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd"] Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.269934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflqb\" (UniqueName: \"kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.270035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.270058 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxzw\" (UniqueName: \"kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw\") pod \"auto-csr-approver-29533410-8sgtm\" (UID: \"58920cad-4515-427a-a4fe-1050a058a462\") " pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.270162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.371518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.371930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxzw\" (UniqueName: \"kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw\") pod \"auto-csr-approver-29533410-8sgtm\" (UID: \"58920cad-4515-427a-a4fe-1050a058a462\") " pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.372117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.372302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflqb\" (UniqueName: \"kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.372389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.380435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.391792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflqb\" (UniqueName: \"kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb\") pod \"collect-profiles-29533410-5blcd\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.394217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxzw\" (UniqueName: 
\"kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw\") pod \"auto-csr-approver-29533410-8sgtm\" (UID: \"58920cad-4515-427a-a4fe-1050a058a462\") " pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.469421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.487702 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.527501 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(786c85fa522e455cc4232bddfa6f71e7a289c63497cc42b9e1bc29b6de8b13a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.527614 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(786c85fa522e455cc4232bddfa6f71e7a289c63497cc42b9e1bc29b6de8b13a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.527653 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(786c85fa522e455cc4232bddfa6f71e7a289c63497cc42b9e1bc29b6de8b13a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.527722 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29533410-8sgtm_openshift-infra(58920cad-4515-427a-a4fe-1050a058a462)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29533410-8sgtm_openshift-infra(58920cad-4515-427a-a4fe-1050a058a462)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(786c85fa522e455cc4232bddfa6f71e7a289c63497cc42b9e1bc29b6de8b13a4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" podUID="58920cad-4515-427a-a4fe-1050a058a462" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.535845 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(ac886840d009e09e90a22d3dfd5088633b39f555fced02e183d382ee866cc806): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.535915 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(ac886840d009e09e90a22d3dfd5088633b39f555fced02e183d382ee866cc806): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.535943 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(ac886840d009e09e90a22d3dfd5088633b39f555fced02e183d382ee866cc806): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.535986 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager(def4ca88-4fde-45f4-a10f-b0bd66600b5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager(def4ca88-4fde-45f4-a10f-b0bd66600b5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(ac886840d009e09e90a22d3dfd5088633b39f555fced02e183d382ee866cc806): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.716739 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.716788 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.717380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: I0225 07:30:00.717568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.764161 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(1c1c3fb0bf0ce6d1ffce1ba341e22718611dbb927cab75d600f8237a42a9b2fa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.764251 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(1c1c3fb0bf0ce6d1ffce1ba341e22718611dbb927cab75d600f8237a42a9b2fa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.764286 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(1c1c3fb0bf0ce6d1ffce1ba341e22718611dbb927cab75d600f8237a42a9b2fa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.764357 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29533410-8sgtm_openshift-infra(58920cad-4515-427a-a4fe-1050a058a462)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29533410-8sgtm_openshift-infra(58920cad-4515-427a-a4fe-1050a058a462)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29533410-8sgtm_openshift-infra_58920cad-4515-427a-a4fe-1050a058a462_0(1c1c3fb0bf0ce6d1ffce1ba341e22718611dbb927cab75d600f8237a42a9b2fa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" podUID="58920cad-4515-427a-a4fe-1050a058a462" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.775925 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(4db5b3da5e4ffb10ce0ffe84a39edd4311f48b76ab9a2710e78b33613a273622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.776018 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(4db5b3da5e4ffb10ce0ffe84a39edd4311f48b76ab9a2710e78b33613a273622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.776051 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(4db5b3da5e4ffb10ce0ffe84a39edd4311f48b76ab9a2710e78b33613a273622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:00 crc kubenswrapper[4749]: E0225 07:30:00.776124 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager(def4ca88-4fde-45f4-a10f-b0bd66600b5e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager(def4ca88-4fde-45f4-a10f-b0bd66600b5e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29533410-5blcd_openshift-operator-lifecycle-manager_def4ca88-4fde-45f4-a10f-b0bd66600b5e_0(4db5b3da5e4ffb10ce0ffe84a39edd4311f48b76ab9a2710e78b33613a273622): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" Feb 25 07:30:12 crc kubenswrapper[4749]: I0225 07:30:12.323067 4749 scope.go:117] "RemoveContainer" containerID="87db0ec8d89f19b24bddf06d388734a22c6f9f31f5d88a5bf9ccea568e2bac88" Feb 25 07:30:12 crc kubenswrapper[4749]: I0225 07:30:12.816861 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bkmjf_21c23d4e-91a8-4374-84dc-7bdc7450661d/kube-multus/2.log" Feb 25 07:30:12 crc kubenswrapper[4749]: I0225 07:30:12.817186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bkmjf" event={"ID":"21c23d4e-91a8-4374-84dc-7bdc7450661d","Type":"ContainerStarted","Data":"09ddef2a8ef6f9c2c80266e2b1386b73dfa0186093b37855f2d33c95725bad4e"} Feb 25 07:30:14 crc kubenswrapper[4749]: I0225 07:30:14.321417 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:14 crc kubenswrapper[4749]: I0225 07:30:14.322099 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:14 crc kubenswrapper[4749]: I0225 07:30:14.405003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgv58" Feb 25 07:30:14 crc kubenswrapper[4749]: I0225 07:30:14.588903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533410-8sgtm"] Feb 25 07:30:14 crc kubenswrapper[4749]: I0225 07:30:14.829637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" event={"ID":"58920cad-4515-427a-a4fe-1050a058a462","Type":"ContainerStarted","Data":"92ffb10c59a62bb192d1af6f22057a25fac6ee0d283b6b9f8aabcf2f6f94b7d3"} Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.322775 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.323308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.568268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd"] Feb 25 07:30:15 crc kubenswrapper[4749]: W0225 07:30:15.581495 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef4ca88_4fde_45f4_a10f_b0bd66600b5e.slice/crio-4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738 WatchSource:0}: Error finding container 4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738: Status 404 returned error can't find the container with id 4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738 Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.836577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" event={"ID":"def4ca88-4fde-45f4-a10f-b0bd66600b5e","Type":"ContainerStarted","Data":"bb727634e9aedd0d8f24eb29fe19932476e05f109970239dd6485a974a293554"} Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.837077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" event={"ID":"def4ca88-4fde-45f4-a10f-b0bd66600b5e","Type":"ContainerStarted","Data":"4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738"} Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.838501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" event={"ID":"58920cad-4515-427a-a4fe-1050a058a462","Type":"ContainerStarted","Data":"a9f1ec3358e14af7d5980d4c073cac917011a29d8978a4ff49c60255e1f6cfe6"} Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.879557 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29533410-8sgtm" podStartSLOduration=15.004984201 podStartE2EDuration="15.879537549s" podCreationTimestamp="2026-02-25 07:30:00 +0000 UTC" firstStartedPulling="2026-02-25 07:30:14.603295144 +0000 UTC m=+767.965121174" lastFinishedPulling="2026-02-25 07:30:15.477848502 +0000 UTC m=+768.839674522" observedRunningTime="2026-02-25 07:30:15.878645137 +0000 UTC m=+769.240471157" watchObservedRunningTime="2026-02-25 07:30:15.879537549 +0000 UTC m=+769.241363569" Feb 25 07:30:15 crc kubenswrapper[4749]: I0225 07:30:15.884116 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" podStartSLOduration=15.884098309 podStartE2EDuration="15.884098309s" podCreationTimestamp="2026-02-25 07:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:30:15.861204366 +0000 UTC m=+769.223030396" watchObservedRunningTime="2026-02-25 07:30:15.884098309 +0000 UTC m=+769.245924329" Feb 25 07:30:16 crc kubenswrapper[4749]: I0225 07:30:16.851004 4749 generic.go:334] "Generic (PLEG): container finished" podID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" containerID="bb727634e9aedd0d8f24eb29fe19932476e05f109970239dd6485a974a293554" exitCode=0 Feb 25 07:30:16 crc kubenswrapper[4749]: I0225 07:30:16.851088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" event={"ID":"def4ca88-4fde-45f4-a10f-b0bd66600b5e","Type":"ContainerDied","Data":"bb727634e9aedd0d8f24eb29fe19932476e05f109970239dd6485a974a293554"} Feb 25 07:30:16 crc kubenswrapper[4749]: I0225 07:30:16.853576 4749 generic.go:334] "Generic (PLEG): container finished" podID="58920cad-4515-427a-a4fe-1050a058a462" containerID="a9f1ec3358e14af7d5980d4c073cac917011a29d8978a4ff49c60255e1f6cfe6" exitCode=0 Feb 25 07:30:16 crc kubenswrapper[4749]: I0225 
07:30:16.853692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" event={"ID":"58920cad-4515-427a-a4fe-1050a058a462","Type":"ContainerDied","Data":"a9f1ec3358e14af7d5980d4c073cac917011a29d8978a4ff49c60255e1f6cfe6"} Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.870235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" event={"ID":"def4ca88-4fde-45f4-a10f-b0bd66600b5e","Type":"ContainerDied","Data":"4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738"} Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.870303 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4007466d9ad37adb3b8ab7a2924c1a7d2983de1cfeb542d9049fadd0b1361738" Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.871749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" event={"ID":"58920cad-4515-427a-a4fe-1050a058a462","Type":"ContainerDied","Data":"92ffb10c59a62bb192d1af6f22057a25fac6ee0d283b6b9f8aabcf2f6f94b7d3"} Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.871800 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ffb10c59a62bb192d1af6f22057a25fac6ee0d283b6b9f8aabcf2f6f94b7d3" Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.915799 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:18 crc kubenswrapper[4749]: I0225 07:30:18.923073 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.037931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lxzw\" (UniqueName: \"kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw\") pod \"58920cad-4515-427a-a4fe-1050a058a462\" (UID: \"58920cad-4515-427a-a4fe-1050a058a462\") " Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.038007 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume\") pod \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.038057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pflqb\" (UniqueName: \"kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb\") pod \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.038080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume\") pod \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\" (UID: \"def4ca88-4fde-45f4-a10f-b0bd66600b5e\") " Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.038875 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume" (OuterVolumeSpecName: "config-volume") pod "def4ca88-4fde-45f4-a10f-b0bd66600b5e" (UID: "def4ca88-4fde-45f4-a10f-b0bd66600b5e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.043351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb" (OuterVolumeSpecName: "kube-api-access-pflqb") pod "def4ca88-4fde-45f4-a10f-b0bd66600b5e" (UID: "def4ca88-4fde-45f4-a10f-b0bd66600b5e"). InnerVolumeSpecName "kube-api-access-pflqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.043486 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "def4ca88-4fde-45f4-a10f-b0bd66600b5e" (UID: "def4ca88-4fde-45f4-a10f-b0bd66600b5e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.044089 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw" (OuterVolumeSpecName: "kube-api-access-2lxzw") pod "58920cad-4515-427a-a4fe-1050a058a462" (UID: "58920cad-4515-427a-a4fe-1050a058a462"). InnerVolumeSpecName "kube-api-access-2lxzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.139330 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lxzw\" (UniqueName: \"kubernetes.io/projected/58920cad-4515-427a-a4fe-1050a058a462-kube-api-access-2lxzw\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.139370 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/def4ca88-4fde-45f4-a10f-b0bd66600b5e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.139384 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pflqb\" (UniqueName: \"kubernetes.io/projected/def4ca88-4fde-45f4-a10f-b0bd66600b5e-kube-api-access-pflqb\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.139395 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def4ca88-4fde-45f4-a10f-b0bd66600b5e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.880743 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533410-8sgtm" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.880873 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd" Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.981288 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533404-5d6x5"] Feb 25 07:30:19 crc kubenswrapper[4749]: I0225 07:30:19.987453 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533404-5d6x5"] Feb 25 07:30:21 crc kubenswrapper[4749]: I0225 07:30:21.330095 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11de3f9-29e5-4785-8769-3e1ea90ab4d9" path="/var/lib/kubelet/pods/f11de3f9-29e5-4785-8769-3e1ea90ab4d9/volumes" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.698535 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq"] Feb 25 07:30:30 crc kubenswrapper[4749]: E0225 07:30:30.699335 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58920cad-4515-427a-a4fe-1050a058a462" containerName="oc" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.699350 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58920cad-4515-427a-a4fe-1050a058a462" containerName="oc" Feb 25 07:30:30 crc kubenswrapper[4749]: E0225 07:30:30.699370 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" containerName="collect-profiles" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.699379 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" containerName="collect-profiles" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.699506 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58920cad-4515-427a-a4fe-1050a058a462" containerName="oc" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.699523 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" containerName="collect-profiles" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.700535 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.705003 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.708462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.708514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.708580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ztg\" (UniqueName: \"kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 
07:30:30.709156 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq"] Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.809582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.809664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.809925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8ztg\" (UniqueName: \"kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.810257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.810449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:30 crc kubenswrapper[4749]: I0225 07:30:30.837239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ztg\" (UniqueName: \"kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:31 crc kubenswrapper[4749]: I0225 07:30:31.022255 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:31 crc kubenswrapper[4749]: I0225 07:30:31.318090 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq"] Feb 25 07:30:31 crc kubenswrapper[4749]: I0225 07:30:31.972721 4749 generic.go:334] "Generic (PLEG): container finished" podID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerID="317e9fd42fd88ed2fb16dda3f282c583004ae4ea64a3825971319d3b699b4f84" exitCode=0 Feb 25 07:30:31 crc kubenswrapper[4749]: I0225 07:30:31.972776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" event={"ID":"f03e9291-ddbb-4b8a-b96c-c66a604694d2","Type":"ContainerDied","Data":"317e9fd42fd88ed2fb16dda3f282c583004ae4ea64a3825971319d3b699b4f84"} Feb 25 07:30:31 crc kubenswrapper[4749]: I0225 07:30:31.972813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" event={"ID":"f03e9291-ddbb-4b8a-b96c-c66a604694d2","Type":"ContainerStarted","Data":"1819cf432bbc9e2a8e15eb4f346018f578ad221d3c037d243a70b1488c92ec24"} Feb 25 07:30:33 crc kubenswrapper[4749]: I0225 07:30:33.989833 4749 generic.go:334] "Generic (PLEG): container finished" podID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerID="8ea1f6f5b33f997ac36ae9ff4fec82a7560604abbb02495d3dd2c629aab0a08d" exitCode=0 Feb 25 07:30:33 crc kubenswrapper[4749]: I0225 07:30:33.989966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" event={"ID":"f03e9291-ddbb-4b8a-b96c-c66a604694d2","Type":"ContainerDied","Data":"8ea1f6f5b33f997ac36ae9ff4fec82a7560604abbb02495d3dd2c629aab0a08d"} Feb 25 07:30:34 crc kubenswrapper[4749]: I0225 07:30:34.433399 4749 
scope.go:117] "RemoveContainer" containerID="c533acd952e9dbc20cd9cac9154998e45aaeb53c83c30768a91c3f094ffde5b8" Feb 25 07:30:35 crc kubenswrapper[4749]: I0225 07:30:35.001017 4749 generic.go:334] "Generic (PLEG): container finished" podID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerID="266abfa218cb589d9ccae181951ce67cb1ffd91fb9a1188dd02cb02e6f02eae5" exitCode=0 Feb 25 07:30:35 crc kubenswrapper[4749]: I0225 07:30:35.001086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" event={"ID":"f03e9291-ddbb-4b8a-b96c-c66a604694d2","Type":"ContainerDied","Data":"266abfa218cb589d9ccae181951ce67cb1ffd91fb9a1188dd02cb02e6f02eae5"} Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.336096 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.488485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle\") pod \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.488588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util\") pod \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\" (UID: \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.488629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8ztg\" (UniqueName: \"kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg\") pod \"f03e9291-ddbb-4b8a-b96c-c66a604694d2\" (UID: 
\"f03e9291-ddbb-4b8a-b96c-c66a604694d2\") " Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.490239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle" (OuterVolumeSpecName: "bundle") pod "f03e9291-ddbb-4b8a-b96c-c66a604694d2" (UID: "f03e9291-ddbb-4b8a-b96c-c66a604694d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.500821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg" (OuterVolumeSpecName: "kube-api-access-c8ztg") pod "f03e9291-ddbb-4b8a-b96c-c66a604694d2" (UID: "f03e9291-ddbb-4b8a-b96c-c66a604694d2"). InnerVolumeSpecName "kube-api-access-c8ztg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.518339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util" (OuterVolumeSpecName: "util") pod "f03e9291-ddbb-4b8a-b96c-c66a604694d2" (UID: "f03e9291-ddbb-4b8a-b96c-c66a604694d2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.589387 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-util\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.589415 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8ztg\" (UniqueName: \"kubernetes.io/projected/f03e9291-ddbb-4b8a-b96c-c66a604694d2-kube-api-access-c8ztg\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:36 crc kubenswrapper[4749]: I0225 07:30:36.589426 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f03e9291-ddbb-4b8a-b96c-c66a604694d2-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:30:37 crc kubenswrapper[4749]: I0225 07:30:37.016177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" event={"ID":"f03e9291-ddbb-4b8a-b96c-c66a604694d2","Type":"ContainerDied","Data":"1819cf432bbc9e2a8e15eb4f346018f578ad221d3c037d243a70b1488c92ec24"} Feb 25 07:30:37 crc kubenswrapper[4749]: I0225 07:30:37.016233 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1819cf432bbc9e2a8e15eb4f346018f578ad221d3c037d243a70b1488c92ec24" Feb 25 07:30:37 crc kubenswrapper[4749]: I0225 07:30:37.016287 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.268827 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-tpddj"] Feb 25 07:30:42 crc kubenswrapper[4749]: E0225 07:30:42.269564 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="extract" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.269581 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="extract" Feb 25 07:30:42 crc kubenswrapper[4749]: E0225 07:30:42.269614 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="util" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.269622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="util" Feb 25 07:30:42 crc kubenswrapper[4749]: E0225 07:30:42.269648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="pull" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.269657 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="pull" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.269771 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03e9291-ddbb-4b8a-b96c-c66a604694d2" containerName="extract" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.270182 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.273924 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.274380 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9x4x4" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.276852 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.290456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-tpddj"] Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.374427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwktx\" (UniqueName: \"kubernetes.io/projected/956da30d-e0b4-45a1-a3b4-773cdada4e30-kube-api-access-rwktx\") pod \"nmstate-operator-694c9596b7-tpddj\" (UID: \"956da30d-e0b4-45a1-a3b4-773cdada4e30\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.476001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwktx\" (UniqueName: \"kubernetes.io/projected/956da30d-e0b4-45a1-a3b4-773cdada4e30-kube-api-access-rwktx\") pod \"nmstate-operator-694c9596b7-tpddj\" (UID: \"956da30d-e0b4-45a1-a3b4-773cdada4e30\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.509115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwktx\" (UniqueName: \"kubernetes.io/projected/956da30d-e0b4-45a1-a3b4-773cdada4e30-kube-api-access-rwktx\") pod \"nmstate-operator-694c9596b7-tpddj\" (UID: 
\"956da30d-e0b4-45a1-a3b4-773cdada4e30\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.587004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" Feb 25 07:30:42 crc kubenswrapper[4749]: I0225 07:30:42.813506 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-tpddj"] Feb 25 07:30:43 crc kubenswrapper[4749]: I0225 07:30:43.070385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" event={"ID":"956da30d-e0b4-45a1-a3b4-773cdada4e30","Type":"ContainerStarted","Data":"bcfde09b09174de79d99e3f50e72eafd7beac5422c094aaf9a3685ea3551f67d"} Feb 25 07:30:46 crc kubenswrapper[4749]: I0225 07:30:46.093113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" event={"ID":"956da30d-e0b4-45a1-a3b4-773cdada4e30","Type":"ContainerStarted","Data":"f254775653246090ad0e57525c1ae547735d4e6269623aee002d73caa3d5d4e1"} Feb 25 07:30:46 crc kubenswrapper[4749]: I0225 07:30:46.128109 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-tpddj" podStartSLOduration=1.5828592000000001 podStartE2EDuration="4.128075043s" podCreationTimestamp="2026-02-25 07:30:42 +0000 UTC" firstStartedPulling="2026-02-25 07:30:42.823823098 +0000 UTC m=+796.185649118" lastFinishedPulling="2026-02-25 07:30:45.369038931 +0000 UTC m=+798.730864961" observedRunningTime="2026-02-25 07:30:46.115921659 +0000 UTC m=+799.477747769" watchObservedRunningTime="2026-02-25 07:30:46.128075043 +0000 UTC m=+799.489901103" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.052132 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 
07:30:51.057768 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.062826 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.063142 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4lvnk" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.078561 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.079413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.089780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.089876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99lfj\" (UniqueName: \"kubernetes.io/projected/ed80efda-7ad1-4992-9d76-f94d50e57216-kube-api-access-99lfj\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.098440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.116892 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.120797 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xpxjw"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.121564 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgdpn\" (UniqueName: \"kubernetes.io/projected/eef1bf93-8a75-4ff2-b172-7601b6861aef-kube-api-access-dgdpn\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-ovs-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-dbus-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " 
pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjfs9\" (UniqueName: \"kubernetes.io/projected/7ba6588d-93c3-481d-8606-1b91fee0267a-kube-api-access-mjfs9\") pod \"nmstate-metrics-58c85c668d-wnqc6\" (UID: \"7ba6588d-93c3-481d-8606-1b91fee0267a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99lfj\" (UniqueName: \"kubernetes.io/projected/ed80efda-7ad1-4992-9d76-f94d50e57216-kube-api-access-99lfj\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: E0225 07:30:51.191435 4749 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 25 07:30:51 crc kubenswrapper[4749]: E0225 07:30:51.191514 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair podName:ed80efda-7ad1-4992-9d76-f94d50e57216 nodeName:}" failed. No retries permitted until 2026-02-25 07:30:51.691496406 +0000 UTC m=+805.053322426 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair") pod "nmstate-webhook-866bcb46dc-6flq6" (UID: "ed80efda-7ad1-4992-9d76-f94d50e57216") : secret "openshift-nmstate-webhook" not found Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.191453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-nmstate-lock\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.242094 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.243024 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.248470 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.248538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99lfj\" (UniqueName: \"kubernetes.io/projected/ed80efda-7ad1-4992-9d76-f94d50e57216-kube-api-access-99lfj\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.248729 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.248932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rgf6b" Feb 25 07:30:51 
crc kubenswrapper[4749]: I0225 07:30:51.251730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.292802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b819184-1695-4312-a0f4-0e0bad53a7d7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jczl\" (UniqueName: \"kubernetes.io/projected/3b819184-1695-4312-a0f4-0e0bad53a7d7-kube-api-access-7jczl\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293716 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-ovs-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-dbus-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjfs9\" (UniqueName: 
\"kubernetes.io/projected/7ba6588d-93c3-481d-8606-1b91fee0267a-kube-api-access-mjfs9\") pod \"nmstate-metrics-58c85c668d-wnqc6\" (UID: \"7ba6588d-93c3-481d-8606-1b91fee0267a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-nmstate-lock\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.293853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-ovs-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.294192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.294241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgdpn\" (UniqueName: \"kubernetes.io/projected/eef1bf93-8a75-4ff2-b172-7601b6861aef-kube-api-access-dgdpn\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.294718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-nmstate-lock\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.294932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eef1bf93-8a75-4ff2-b172-7601b6861aef-dbus-socket\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.311047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjfs9\" (UniqueName: \"kubernetes.io/projected/7ba6588d-93c3-481d-8606-1b91fee0267a-kube-api-access-mjfs9\") pod \"nmstate-metrics-58c85c668d-wnqc6\" (UID: \"7ba6588d-93c3-481d-8606-1b91fee0267a\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.321474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgdpn\" (UniqueName: \"kubernetes.io/projected/eef1bf93-8a75-4ff2-b172-7601b6861aef-kube-api-access-dgdpn\") pod \"nmstate-handler-xpxjw\" (UID: \"eef1bf93-8a75-4ff2-b172-7601b6861aef\") " pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.382776 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-745879c58-kkt99"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.383436 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.395382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.395427 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b819184-1695-4312-a0f4-0e0bad53a7d7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.395447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jczl\" (UniqueName: \"kubernetes.io/projected/3b819184-1695-4312-a0f4-0e0bad53a7d7-kube-api-access-7jczl\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: E0225 07:30:51.395607 4749 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 25 07:30:51 crc kubenswrapper[4749]: E0225 07:30:51.395664 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert podName:3b819184-1695-4312-a0f4-0e0bad53a7d7 nodeName:}" failed. No retries permitted until 2026-02-25 07:30:51.895647605 +0000 UTC m=+805.257473625 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-gvgqz" (UID: "3b819184-1695-4312-a0f4-0e0bad53a7d7") : secret "plugin-serving-cert" not found Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.396508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3b819184-1695-4312-a0f4-0e0bad53a7d7-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.399197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-745879c58-kkt99"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.399430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.417584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jczl\" (UniqueName: \"kubernetes.io/projected/3b819184-1695-4312-a0f4-0e0bad53a7d7-kube-api-access-7jczl\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.441071 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:51 crc kubenswrapper[4749]: W0225 07:30:51.464811 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef1bf93_8a75_4ff2_b172_7601b6861aef.slice/crio-0ae8131cccf2a5764d267aa57497625180fbd91facc70227574f72d9dcaa0e6a WatchSource:0}: Error finding container 0ae8131cccf2a5764d267aa57497625180fbd91facc70227574f72d9dcaa0e6a: Status 404 returned error can't find the container with id 0ae8131cccf2a5764d267aa57497625180fbd91facc70227574f72d9dcaa0e6a Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497005 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-oauth-config\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497350 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsqh\" (UniqueName: \"kubernetes.io/projected/713bbb63-227f-43b4-996e-09d5351263b4-kube-api-access-vwsqh\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-oauth-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497405 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-trusted-ca-bundle\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-service-ca\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-console-config\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.497821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.580527 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6"] Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-oauth-config\") pod 
\"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsqh\" (UniqueName: \"kubernetes.io/projected/713bbb63-227f-43b4-996e-09d5351263b4-kube-api-access-vwsqh\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-oauth-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-trusted-ca-bundle\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-service-ca\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-console-config\") pod \"console-745879c58-kkt99\" (UID: 
\"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.599530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.600574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-oauth-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.600806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-service-ca\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.600820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-console-config\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.601301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/713bbb63-227f-43b4-996e-09d5351263b4-trusted-ca-bundle\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " 
pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.606384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-serving-cert\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.606570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/713bbb63-227f-43b4-996e-09d5351263b4-console-oauth-config\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.618218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsqh\" (UniqueName: \"kubernetes.io/projected/713bbb63-227f-43b4-996e-09d5351263b4-kube-api-access-vwsqh\") pod \"console-745879c58-kkt99\" (UID: \"713bbb63-227f-43b4-996e-09d5351263b4\") " pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.671477 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.671577 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:30:51 crc 
kubenswrapper[4749]: I0225 07:30:51.699246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.700483 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.705405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed80efda-7ad1-4992-9d76-f94d50e57216-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-6flq6\" (UID: \"ed80efda-7ad1-4992-9d76-f94d50e57216\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.903393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.907483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b819184-1695-4312-a0f4-0e0bad53a7d7-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-gvgqz\" (UID: \"3b819184-1695-4312-a0f4-0e0bad53a7d7\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.933796 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-745879c58-kkt99"] Feb 25 07:30:51 crc 
kubenswrapper[4749]: W0225 07:30:51.936679 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod713bbb63_227f_43b4_996e_09d5351263b4.slice/crio-64a4e46f06f70325f235324718775c4180098d1033567744bff238a80ec2a5ff WatchSource:0}: Error finding container 64a4e46f06f70325f235324718775c4180098d1033567744bff238a80ec2a5ff: Status 404 returned error can't find the container with id 64a4e46f06f70325f235324718775c4180098d1033567744bff238a80ec2a5ff Feb 25 07:30:51 crc kubenswrapper[4749]: I0225 07:30:51.984038 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.127209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xpxjw" event={"ID":"eef1bf93-8a75-4ff2-b172-7601b6861aef","Type":"ContainerStarted","Data":"0ae8131cccf2a5764d267aa57497625180fbd91facc70227574f72d9dcaa0e6a"} Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.128812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-745879c58-kkt99" event={"ID":"713bbb63-227f-43b4-996e-09d5351263b4","Type":"ContainerStarted","Data":"c0d903fafb7de67b6c0f26d27023f051323819c3266f868b1a9d8f1ded821d21"} Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.128861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-745879c58-kkt99" event={"ID":"713bbb63-227f-43b4-996e-09d5351263b4","Type":"ContainerStarted","Data":"64a4e46f06f70325f235324718775c4180098d1033567744bff238a80ec2a5ff"} Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.130797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" event={"ID":"7ba6588d-93c3-481d-8606-1b91fee0267a","Type":"ContainerStarted","Data":"d64027d7219b382424dda760e6d507a152db0927673743096afddda95ea6c9fc"} Feb 
25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.152687 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-745879c58-kkt99" podStartSLOduration=1.152665249 podStartE2EDuration="1.152665249s" podCreationTimestamp="2026-02-25 07:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:30:52.150546847 +0000 UTC m=+805.512372867" watchObservedRunningTime="2026-02-25 07:30:52.152665249 +0000 UTC m=+805.514491279" Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.177571 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.250711 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6"] Feb 25 07:30:52 crc kubenswrapper[4749]: W0225 07:30:52.261186 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded80efda_7ad1_4992_9d76_f94d50e57216.slice/crio-46aec7cf5bd733bda1f08698f565fd83f240f060cff246b94dc230113c4b7179 WatchSource:0}: Error finding container 46aec7cf5bd733bda1f08698f565fd83f240f060cff246b94dc230113c4b7179: Status 404 returned error can't find the container with id 46aec7cf5bd733bda1f08698f565fd83f240f060cff246b94dc230113c4b7179 Feb 25 07:30:52 crc kubenswrapper[4749]: I0225 07:30:52.429053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz"] Feb 25 07:30:53 crc kubenswrapper[4749]: I0225 07:30:53.138472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" 
event={"ID":"3b819184-1695-4312-a0f4-0e0bad53a7d7","Type":"ContainerStarted","Data":"3d0407f77971ec8206360dbf917ca937672b1ff48597aa751ade0a74d2190cd1"} Feb 25 07:30:53 crc kubenswrapper[4749]: I0225 07:30:53.140817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" event={"ID":"ed80efda-7ad1-4992-9d76-f94d50e57216","Type":"ContainerStarted","Data":"46aec7cf5bd733bda1f08698f565fd83f240f060cff246b94dc230113c4b7179"} Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.150973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xpxjw" event={"ID":"eef1bf93-8a75-4ff2-b172-7601b6861aef","Type":"ContainerStarted","Data":"2781a6a8b8ec548e3ed99ffb90f9bcdd2aa230404d33a016490709db06000db5"} Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.151196 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.158908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" event={"ID":"7ba6588d-93c3-481d-8606-1b91fee0267a","Type":"ContainerStarted","Data":"85fdba2d70732e7abc783b156856a81bccf58566393a039a705ca5f7650a30e8"} Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.163230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" event={"ID":"ed80efda-7ad1-4992-9d76-f94d50e57216","Type":"ContainerStarted","Data":"0fbfd1425951b0ee0649d21d277b3329e553090ac356d36b1f751c28a2700699"} Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.163924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.204320 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xpxjw" 
podStartSLOduration=0.927596564 podStartE2EDuration="3.204303301s" podCreationTimestamp="2026-02-25 07:30:51 +0000 UTC" firstStartedPulling="2026-02-25 07:30:51.467064973 +0000 UTC m=+804.828890993" lastFinishedPulling="2026-02-25 07:30:53.74377172 +0000 UTC m=+807.105597730" observedRunningTime="2026-02-25 07:30:54.182840282 +0000 UTC m=+807.544666292" watchObservedRunningTime="2026-02-25 07:30:54.204303301 +0000 UTC m=+807.566129311" Feb 25 07:30:54 crc kubenswrapper[4749]: I0225 07:30:54.206096 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" podStartSLOduration=1.747603961 podStartE2EDuration="3.206088144s" podCreationTimestamp="2026-02-25 07:30:51 +0000 UTC" firstStartedPulling="2026-02-25 07:30:52.26349119 +0000 UTC m=+805.625317210" lastFinishedPulling="2026-02-25 07:30:53.721975373 +0000 UTC m=+807.083801393" observedRunningTime="2026-02-25 07:30:54.199983657 +0000 UTC m=+807.561809697" watchObservedRunningTime="2026-02-25 07:30:54.206088144 +0000 UTC m=+807.567914164" Feb 25 07:30:55 crc kubenswrapper[4749]: I0225 07:30:55.174839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" event={"ID":"3b819184-1695-4312-a0f4-0e0bad53a7d7","Type":"ContainerStarted","Data":"7bebe9f55f638ad190a0e81573bd835ef01352e70c4ff122f8e068eb34893a54"} Feb 25 07:30:57 crc kubenswrapper[4749]: I0225 07:30:57.195495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" event={"ID":"7ba6588d-93c3-481d-8606-1b91fee0267a","Type":"ContainerStarted","Data":"23e5037b5dd6aca6c38e694820a890f4209e821b143ab70f5242914887bf8446"} Feb 25 07:30:57 crc kubenswrapper[4749]: I0225 07:30:57.218070 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-gvgqz" podStartSLOduration=3.969408119 
podStartE2EDuration="6.218037078s" podCreationTimestamp="2026-02-25 07:30:51 +0000 UTC" firstStartedPulling="2026-02-25 07:30:52.436830413 +0000 UTC m=+805.798656433" lastFinishedPulling="2026-02-25 07:30:54.685459372 +0000 UTC m=+808.047285392" observedRunningTime="2026-02-25 07:30:55.193638935 +0000 UTC m=+808.555464955" watchObservedRunningTime="2026-02-25 07:30:57.218037078 +0000 UTC m=+810.579863138" Feb 25 07:31:01 crc kubenswrapper[4749]: I0225 07:31:01.473461 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xpxjw" Feb 25 07:31:01 crc kubenswrapper[4749]: I0225 07:31:01.490989 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-wnqc6" podStartSLOduration=6.036213939 podStartE2EDuration="10.490971657s" podCreationTimestamp="2026-02-25 07:30:51 +0000 UTC" firstStartedPulling="2026-02-25 07:30:51.594335502 +0000 UTC m=+804.956161522" lastFinishedPulling="2026-02-25 07:30:56.04909319 +0000 UTC m=+809.410919240" observedRunningTime="2026-02-25 07:30:57.212003773 +0000 UTC m=+810.573829833" watchObservedRunningTime="2026-02-25 07:31:01.490971657 +0000 UTC m=+814.852797677" Feb 25 07:31:01 crc kubenswrapper[4749]: I0225 07:31:01.699702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:31:01 crc kubenswrapper[4749]: I0225 07:31:01.700105 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:31:01 crc kubenswrapper[4749]: I0225 07:31:01.708867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:31:02 crc kubenswrapper[4749]: I0225 07:31:02.248537 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-745879c58-kkt99" Feb 25 07:31:02 crc 
kubenswrapper[4749]: I0225 07:31:02.345710 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:31:06 crc kubenswrapper[4749]: I0225 07:31:06.755580 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 25 07:31:11 crc kubenswrapper[4749]: I0225 07:31:11.994354 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-6flq6" Feb 25 07:31:21 crc kubenswrapper[4749]: I0225 07:31:21.672142 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:31:21 crc kubenswrapper[4749]: I0225 07:31:21.673039 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.044861 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr"] Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.047275 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.052126 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.060922 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr"] Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.244291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5zsw\" (UniqueName: \"kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.244338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.244373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: 
I0225 07:31:27.345470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5zsw\" (UniqueName: \"kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.345560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.345786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.346274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.346359 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.377450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5zsw\" (UniqueName: \"kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.426265 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wmg6x" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" containerID="cri-o://e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a" gracePeriod=15 Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.683807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.692107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.758456 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wmg6x_6943031e-49a1-441a-a659-579d68c5879a/console/0.log" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.758964 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.897343 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr"] Feb 25 07:31:27 crc kubenswrapper[4749]: W0225 07:31:27.901986 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64fb5ac6_97f4_4e82_9dea_8d439e3f1fd1.slice/crio-fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1 WatchSource:0}: Error finding container fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1: Status 404 returned error can't find the container with id fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1 Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952489 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.952579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcgh6\" (UniqueName: \"kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6\") pod \"6943031e-49a1-441a-a659-579d68c5879a\" (UID: \"6943031e-49a1-441a-a659-579d68c5879a\") " Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.953185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca" (OuterVolumeSpecName: "service-ca") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.953197 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config" (OuterVolumeSpecName: "console-config") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.953283 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.954791 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.957616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.957638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6" (OuterVolumeSpecName: "kube-api-access-rcgh6") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "kube-api-access-rcgh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:31:27 crc kubenswrapper[4749]: I0225 07:31:27.957885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6943031e-49a1-441a-a659-579d68c5879a" (UID: "6943031e-49a1-441a-a659-579d68c5879a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.053925 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054275 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054292 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6943031e-49a1-441a-a659-579d68c5879a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054304 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054316 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-service-ca\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054324 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6943031e-49a1-441a-a659-579d68c5879a-console-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.054332 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcgh6\" (UniqueName: \"kubernetes.io/projected/6943031e-49a1-441a-a659-579d68c5879a-kube-api-access-rcgh6\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wmg6x_6943031e-49a1-441a-a659-579d68c5879a/console/0.log" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440409 4749 generic.go:334] "Generic (PLEG): container finished" podID="6943031e-49a1-441a-a659-579d68c5879a" containerID="e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a" exitCode=2 Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440503 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wmg6x" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmg6x" event={"ID":"6943031e-49a1-441a-a659-579d68c5879a","Type":"ContainerDied","Data":"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a"} Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wmg6x" event={"ID":"6943031e-49a1-441a-a659-579d68c5879a","Type":"ContainerDied","Data":"dc7c54744a1379f9f41777ea133e56aa76904a82b7055d4ba09fa3200642dfd4"} Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.440680 4749 scope.go:117] "RemoveContainer" containerID="e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.442921 4749 generic.go:334] "Generic (PLEG): container finished" podID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerID="69a9bf7ba2228b2ef7faa351eb0e51189f13874f54a7cb354b7418209a0465f3" exitCode=0 Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.442992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" event={"ID":"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1","Type":"ContainerDied","Data":"69a9bf7ba2228b2ef7faa351eb0e51189f13874f54a7cb354b7418209a0465f3"} Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.443049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" event={"ID":"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1","Type":"ContainerStarted","Data":"fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1"} Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.445342 4749 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.476277 4749 scope.go:117] "RemoveContainer" containerID="e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a" Feb 25 07:31:28 crc kubenswrapper[4749]: E0225 07:31:28.476820 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a\": container with ID starting with e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a not found: ID does not exist" containerID="e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.476860 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a"} err="failed to get container status \"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a\": rpc error: code = NotFound desc = could not find container \"e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a\": container with ID starting with e2504509a0aacd9a662c9f04362b50a21441de7662f2961392b5e2da3859599a not found: ID does not exist" Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.497723 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:31:28 crc kubenswrapper[4749]: I0225 07:31:28.504101 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wmg6x"] Feb 25 07:31:29 crc kubenswrapper[4749]: I0225 07:31:29.332391 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6943031e-49a1-441a-a659-579d68c5879a" path="/var/lib/kubelet/pods/6943031e-49a1-441a-a659-579d68c5879a/volumes" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.418026 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:31:30 crc kubenswrapper[4749]: E0225 07:31:30.418852 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.418889 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.419081 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6943031e-49a1-441a-a659-579d68c5879a" containerName="console" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.421337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.431177 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.461079 4749 generic.go:334] "Generic (PLEG): container finished" podID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerID="6edab81cb825288dff944cdcd70f5653b35909203ded28e253cb3da1b104045f" exitCode=0 Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.461149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" event={"ID":"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1","Type":"ContainerDied","Data":"6edab81cb825288dff944cdcd70f5653b35909203ded28e253cb3da1b104045f"} Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.594530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljd4n\" (UniqueName: \"kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " 
pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.594585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.594712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.696237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.696374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.696410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljd4n\" (UniqueName: \"kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " 
pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.696831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.697045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.723854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljd4n\" (UniqueName: \"kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n\") pod \"redhat-operators-nb4tl\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.743744 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:30 crc kubenswrapper[4749]: I0225 07:31:30.957026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:31:31 crc kubenswrapper[4749]: I0225 07:31:31.467947 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9981155-4e11-4c94-a286-dcfc5972252b" containerID="63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a" exitCode=0 Feb 25 07:31:31 crc kubenswrapper[4749]: I0225 07:31:31.468038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerDied","Data":"63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a"} Feb 25 07:31:31 crc kubenswrapper[4749]: I0225 07:31:31.468270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerStarted","Data":"0d6f6f85c9ed6f9fbee590c358facebe71f4f6e19c13ab0d7225654a887380da"} Feb 25 07:31:31 crc kubenswrapper[4749]: I0225 07:31:31.471684 4749 generic.go:334] "Generic (PLEG): container finished" podID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerID="570a2c8dbc528c2b42c08c58abc4e0c8d2b7a09c941485670c8b861392341f61" exitCode=0 Feb 25 07:31:31 crc kubenswrapper[4749]: I0225 07:31:31.471718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" event={"ID":"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1","Type":"ContainerDied","Data":"570a2c8dbc528c2b42c08c58abc4e0c8d2b7a09c941485670c8b861392341f61"} Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.806432 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.929556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5zsw\" (UniqueName: \"kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw\") pod \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.930078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util\") pod \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.930112 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle\") pod \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\" (UID: \"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1\") " Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.931809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle" (OuterVolumeSpecName: "bundle") pod "64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" (UID: "64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:31:32 crc kubenswrapper[4749]: E0225 07:31:32.932187 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/c5/c5bb93121403c0d27942e6bdad2826c193f528a184e4e792a0e0ad29a930b4b3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260225%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260225T073131Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=49e05155ae7c2718d29dff55f9c28c2652b05fbbe5395efb9bc7d4e95f0e5dc5®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1772005591~hmac=4e4c5ebb39736cf806dc1168961413cbd3d345c4a21546261682e047e96c75b5\": remote error: tls: internal error" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 25 07:31:32 crc kubenswrapper[4749]: E0225 07:31:32.932353 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ljd4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nb4tl_openshift-marketplace(a9981155-4e11-4c94-a286-dcfc5972252b): ErrImagePull: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/c5/c5bb93121403c0d27942e6bdad2826c193f528a184e4e792a0e0ad29a930b4b3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260225%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260225T073131Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=49e05155ae7c2718d29dff55f9c28c2652b05fbbe5395efb9bc7d4e95f0e5dc5®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1772005591~hmac=4e4c5ebb39736cf806dc1168961413cbd3d345c4a21546261682e047e96c75b5\": remote error: tls: internal error" logger="UnhandledError" Feb 25 07:31:32 crc kubenswrapper[4749]: E0225 07:31:32.933798 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/c5/c5bb93121403c0d27942e6bdad2826c193f528a184e4e792a0e0ad29a930b4b3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260225%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260225T073131Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=49e05155ae7c2718d29dff55f9c28c2652b05fbbe5395efb9bc7d4e95f0e5dc5®ion=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1772005591~hmac=4e4c5ebb39736cf806dc1168961413cbd3d345c4a21546261682e047e96c75b5\\\": remote error: tls: internal error\"" pod="openshift-marketplace/redhat-operators-nb4tl" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.940297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw" (OuterVolumeSpecName: "kube-api-access-k5zsw") pod "64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" (UID: 
"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1"). InnerVolumeSpecName "kube-api-access-k5zsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:31:32 crc kubenswrapper[4749]: I0225 07:31:32.954935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util" (OuterVolumeSpecName: "util") pod "64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" (UID: "64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.031916 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5zsw\" (UniqueName: \"kubernetes.io/projected/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-kube-api-access-k5zsw\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.031968 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-util\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.031994 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.489446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" event={"ID":"64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1","Type":"ContainerDied","Data":"fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1"} Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.489795 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc56022e237859d408141552e024395e321faf2d4f8ab54fd51e2a0d714018f1" Feb 25 07:31:33 crc kubenswrapper[4749]: I0225 07:31:33.489561 4749 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr" Feb 25 07:31:33 crc kubenswrapper[4749]: E0225 07:31:33.493258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nb4tl" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:40.999986 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:41 crc kubenswrapper[4749]: E0225 07:31:41.000780 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="extract" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.000795 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="extract" Feb 25 07:31:41 crc kubenswrapper[4749]: E0225 07:31:41.000807 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="util" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.000814 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="util" Feb 25 07:31:41 crc kubenswrapper[4749]: E0225 07:31:41.000825 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="pull" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.000832 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="pull" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.000961 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1" containerName="extract" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.001907 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.019546 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.141279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.141349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rlkm\" (UniqueName: \"kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.141413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.242854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content\") pod \"certified-operators-dz99s\" (UID: 
\"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.242936 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rlkm\" (UniqueName: \"kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.243037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.243708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.243943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities\") pod \"certified-operators-dz99s\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.270341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rlkm\" (UniqueName: \"kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm\") pod \"certified-operators-dz99s\" (UID: 
\"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.322696 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.747565 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56887f7db6-72j45"] Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.748693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.755917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.756218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.756486 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.756557 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.756576 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7h4r2" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.771525 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56887f7db6-72j45"] Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.851995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wfsn\" (UniqueName: 
\"kubernetes.io/projected/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-kube-api-access-6wfsn\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.852057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-webhook-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.852104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-apiservice-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.906364 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.957849 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wfsn\" (UniqueName: \"kubernetes.io/projected/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-kube-api-access-6wfsn\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.957896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-webhook-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.957943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-apiservice-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.965409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-apiservice-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.973607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-webhook-cert\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:41 crc kubenswrapper[4749]: I0225 07:31:41.994128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wfsn\" (UniqueName: \"kubernetes.io/projected/3ae3ac00-e47a-4cc0-ba56-9f0f885163ca-kube-api-access-6wfsn\") pod \"metallb-operator-controller-manager-56887f7db6-72j45\" (UID: \"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca\") " 
pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.053053 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6"] Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.053736 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.057020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.060309 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qljbh" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.060415 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.064447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6"] Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.072342 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.160368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2j6\" (UniqueName: \"kubernetes.io/projected/aacb357e-4983-4cb4-86df-6fd3119e8b15-kube-api-access-dd2j6\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.160628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-webhook-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.160711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-apiservice-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.261782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-apiservice-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.261873 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dd2j6\" (UniqueName: \"kubernetes.io/projected/aacb357e-4983-4cb4-86df-6fd3119e8b15-kube-api-access-dd2j6\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.261904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-webhook-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.267232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-webhook-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.267365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aacb357e-4983-4cb4-86df-6fd3119e8b15-apiservice-cert\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.278404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2j6\" (UniqueName: \"kubernetes.io/projected/aacb357e-4983-4cb4-86df-6fd3119e8b15-kube-api-access-dd2j6\") pod \"metallb-operator-webhook-server-96dfd7b56-f9wq6\" (UID: \"aacb357e-4983-4cb4-86df-6fd3119e8b15\") " 
pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.359201 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56887f7db6-72j45"] Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.367013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:42 crc kubenswrapper[4749]: W0225 07:31:42.373929 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae3ac00_e47a_4cc0_ba56_9f0f885163ca.slice/crio-80e880740a6d0ec48df610ea4b4f08546b44a3a5885a5bd101682a17a0751656 WatchSource:0}: Error finding container 80e880740a6d0ec48df610ea4b4f08546b44a3a5885a5bd101682a17a0751656: Status 404 returned error can't find the container with id 80e880740a6d0ec48df610ea4b4f08546b44a3a5885a5bd101682a17a0751656 Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.552099 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerID="d23f6f423aa258462429b8dfa8b54bfd5bbd68f02170d95df059c97a53fd5944" exitCode=0 Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.552405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerDied","Data":"d23f6f423aa258462429b8dfa8b54bfd5bbd68f02170d95df059c97a53fd5944"} Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.552437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerStarted","Data":"17391492a330fabd591bb0e574c9558e40a34c879b062057a9c4637f390dec95"} Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.563881 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" event={"ID":"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca","Type":"ContainerStarted","Data":"80e880740a6d0ec48df610ea4b4f08546b44a3a5885a5bd101682a17a0751656"} Feb 25 07:31:42 crc kubenswrapper[4749]: I0225 07:31:42.628990 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6"] Feb 25 07:31:42 crc kubenswrapper[4749]: W0225 07:31:42.635226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacb357e_4983_4cb4_86df_6fd3119e8b15.slice/crio-b020fb8ed650e4308310aa3703aefad44771872c10a9d5ab498b98d68b210294 WatchSource:0}: Error finding container b020fb8ed650e4308310aa3703aefad44771872c10a9d5ab498b98d68b210294: Status 404 returned error can't find the container with id b020fb8ed650e4308310aa3703aefad44771872c10a9d5ab498b98d68b210294 Feb 25 07:31:43 crc kubenswrapper[4749]: I0225 07:31:43.577814 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerID="e07bbb08a02fa24a03d39efe692caa3a8c4d46431fea2304b1d840b28866cf80" exitCode=0 Feb 25 07:31:43 crc kubenswrapper[4749]: I0225 07:31:43.577902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerDied","Data":"e07bbb08a02fa24a03d39efe692caa3a8c4d46431fea2304b1d840b28866cf80"} Feb 25 07:31:43 crc kubenswrapper[4749]: I0225 07:31:43.580263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" event={"ID":"aacb357e-4983-4cb4-86df-6fd3119e8b15","Type":"ContainerStarted","Data":"b020fb8ed650e4308310aa3703aefad44771872c10a9d5ab498b98d68b210294"} Feb 25 07:31:44 crc kubenswrapper[4749]: I0225 07:31:44.592103 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerStarted","Data":"dbc9e5064118b94004820a3281963a2849462487e095d45962d3e20b6bce00ee"} Feb 25 07:31:44 crc kubenswrapper[4749]: I0225 07:31:44.613169 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dz99s" podStartSLOduration=3.164932593 podStartE2EDuration="4.613147804s" podCreationTimestamp="2026-02-25 07:31:40 +0000 UTC" firstStartedPulling="2026-02-25 07:31:42.554730785 +0000 UTC m=+855.916556805" lastFinishedPulling="2026-02-25 07:31:44.002945996 +0000 UTC m=+857.364772016" observedRunningTime="2026-02-25 07:31:44.606852717 +0000 UTC m=+857.968678737" watchObservedRunningTime="2026-02-25 07:31:44.613147804 +0000 UTC m=+857.974973824" Feb 25 07:31:45 crc kubenswrapper[4749]: I0225 07:31:45.606229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" event={"ID":"3ae3ac00-e47a-4cc0-ba56-9f0f885163ca","Type":"ContainerStarted","Data":"f02c3e7b1f548e729c8dcc7b3992a18fff2d4a00467200a2de03d152f7812e62"} Feb 25 07:31:45 crc kubenswrapper[4749]: I0225 07:31:45.606448 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:31:45 crc kubenswrapper[4749]: I0225 07:31:45.635855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" podStartSLOduration=1.668130355 podStartE2EDuration="4.635838626s" podCreationTimestamp="2026-02-25 07:31:41 +0000 UTC" firstStartedPulling="2026-02-25 07:31:42.38015436 +0000 UTC m=+855.741980380" lastFinishedPulling="2026-02-25 07:31:45.347862631 +0000 UTC m=+858.709688651" observedRunningTime="2026-02-25 07:31:45.635261123 +0000 UTC m=+858.997087153" 
watchObservedRunningTime="2026-02-25 07:31:45.635838626 +0000 UTC m=+858.997664646" Feb 25 07:31:48 crc kubenswrapper[4749]: I0225 07:31:48.630655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" event={"ID":"aacb357e-4983-4cb4-86df-6fd3119e8b15","Type":"ContainerStarted","Data":"1a5fb9e3d67c451e382cda1bd226ff333fe46841e245f81e75eedd2dde503815"} Feb 25 07:31:48 crc kubenswrapper[4749]: I0225 07:31:48.631209 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:31:48 crc kubenswrapper[4749]: I0225 07:31:48.632894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerStarted","Data":"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419"} Feb 25 07:31:48 crc kubenswrapper[4749]: I0225 07:31:48.650023 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" podStartSLOduration=1.2941456009999999 podStartE2EDuration="6.650000829s" podCreationTimestamp="2026-02-25 07:31:42 +0000 UTC" firstStartedPulling="2026-02-25 07:31:42.638804942 +0000 UTC m=+856.000630962" lastFinishedPulling="2026-02-25 07:31:47.99466017 +0000 UTC m=+861.356486190" observedRunningTime="2026-02-25 07:31:48.64795642 +0000 UTC m=+862.009782450" watchObservedRunningTime="2026-02-25 07:31:48.650000829 +0000 UTC m=+862.011826849" Feb 25 07:31:49 crc kubenswrapper[4749]: I0225 07:31:49.641373 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9981155-4e11-4c94-a286-dcfc5972252b" containerID="adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419" exitCode=0 Feb 25 07:31:49 crc kubenswrapper[4749]: I0225 07:31:49.641452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerDied","Data":"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419"} Feb 25 07:31:50 crc kubenswrapper[4749]: I0225 07:31:50.649167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerStarted","Data":"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d"} Feb 25 07:31:50 crc kubenswrapper[4749]: I0225 07:31:50.671877 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nb4tl" podStartSLOduration=1.847594409 podStartE2EDuration="20.671854956s" podCreationTimestamp="2026-02-25 07:31:30 +0000 UTC" firstStartedPulling="2026-02-25 07:31:31.469583317 +0000 UTC m=+844.831409337" lastFinishedPulling="2026-02-25 07:31:50.293843864 +0000 UTC m=+863.655669884" observedRunningTime="2026-02-25 07:31:50.666339807 +0000 UTC m=+864.028165827" watchObservedRunningTime="2026-02-25 07:31:50.671854956 +0000 UTC m=+864.033680976" Feb 25 07:31:50 crc kubenswrapper[4749]: I0225 07:31:50.744545 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:50 crc kubenswrapper[4749]: I0225 07:31:50.744792 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.328705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.328745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.363305 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.672223 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.672290 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.672353 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.673059 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.673115 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13" gracePeriod=600 Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.733487 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.777143 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nb4tl" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="registry-server" probeResult="failure" output=< Feb 25 07:31:51 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:31:51 crc kubenswrapper[4749]: > Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.810062 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.811411 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.825407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.996024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.996125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:51 crc kubenswrapper[4749]: I0225 07:31:51.996218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8t962\" (UniqueName: \"kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.097064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.097147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t962\" (UniqueName: \"kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.097191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.097740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.097763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.124353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t962\" (UniqueName: \"kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962\") pod \"community-operators-hszqp\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.180051 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.661666 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13" exitCode=0 Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.661888 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13"} Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.661989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65"} Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.662007 4749 scope.go:117] "RemoveContainer" containerID="de8a22a0c875bf538179f8ec85b5f0d04fd3549c604ae517de2de5aad4140b04" Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.703918 
4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:31:52 crc kubenswrapper[4749]: W0225 07:31:52.709521 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aea4381_c806_43f8_b557_58e02dc817ba.slice/crio-896f91e5efb1fd44778af576d289d4eca786b1a25bbb99bd46752772c8942980 WatchSource:0}: Error finding container 896f91e5efb1fd44778af576d289d4eca786b1a25bbb99bd46752772c8942980: Status 404 returned error can't find the container with id 896f91e5efb1fd44778af576d289d4eca786b1a25bbb99bd46752772c8942980 Feb 25 07:31:52 crc kubenswrapper[4749]: I0225 07:31:52.991969 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:53 crc kubenswrapper[4749]: I0225 07:31:53.674655 4749 generic.go:334] "Generic (PLEG): container finished" podID="5aea4381-c806-43f8-b557-58e02dc817ba" containerID="74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836" exitCode=0 Feb 25 07:31:53 crc kubenswrapper[4749]: I0225 07:31:53.674708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerDied","Data":"74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836"} Feb 25 07:31:53 crc kubenswrapper[4749]: I0225 07:31:53.675061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerStarted","Data":"896f91e5efb1fd44778af576d289d4eca786b1a25bbb99bd46752772c8942980"} Feb 25 07:31:53 crc kubenswrapper[4749]: I0225 07:31:53.675259 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dz99s" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="registry-server" 
containerID="cri-o://dbc9e5064118b94004820a3281963a2849462487e095d45962d3e20b6bce00ee" gracePeriod=2 Feb 25 07:31:56 crc kubenswrapper[4749]: I0225 07:31:56.692444 4749 generic.go:334] "Generic (PLEG): container finished" podID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerID="dbc9e5064118b94004820a3281963a2849462487e095d45962d3e20b6bce00ee" exitCode=0 Feb 25 07:31:56 crc kubenswrapper[4749]: I0225 07:31:56.692491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerDied","Data":"dbc9e5064118b94004820a3281963a2849462487e095d45962d3e20b6bce00ee"} Feb 25 07:31:56 crc kubenswrapper[4749]: I0225 07:31:56.695658 4749 generic.go:334] "Generic (PLEG): container finished" podID="5aea4381-c806-43f8-b557-58e02dc817ba" containerID="ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393" exitCode=0 Feb 25 07:31:56 crc kubenswrapper[4749]: I0225 07:31:56.695696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerDied","Data":"ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393"} Feb 25 07:31:56 crc kubenswrapper[4749]: I0225 07:31:56.906684 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.060243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rlkm\" (UniqueName: \"kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm\") pod \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.060348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content\") pod \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.060389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities\") pod \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\" (UID: \"e4a4795b-0ec9-4d3b-924f-cea3796f4c85\") " Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.061368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities" (OuterVolumeSpecName: "utilities") pod "e4a4795b-0ec9-4d3b-924f-cea3796f4c85" (UID: "e4a4795b-0ec9-4d3b-924f-cea3796f4c85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.069505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm" (OuterVolumeSpecName: "kube-api-access-2rlkm") pod "e4a4795b-0ec9-4d3b-924f-cea3796f4c85" (UID: "e4a4795b-0ec9-4d3b-924f-cea3796f4c85"). InnerVolumeSpecName "kube-api-access-2rlkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.137861 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4a4795b-0ec9-4d3b-924f-cea3796f4c85" (UID: "e4a4795b-0ec9-4d3b-924f-cea3796f4c85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.162105 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rlkm\" (UniqueName: \"kubernetes.io/projected/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-kube-api-access-2rlkm\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.162136 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.162145 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a4795b-0ec9-4d3b-924f-cea3796f4c85-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.704794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerStarted","Data":"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996"} Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.708849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz99s" event={"ID":"e4a4795b-0ec9-4d3b-924f-cea3796f4c85","Type":"ContainerDied","Data":"17391492a330fabd591bb0e574c9558e40a34c879b062057a9c4637f390dec95"} Feb 25 07:31:57 crc kubenswrapper[4749]: 
I0225 07:31:57.708885 4749 scope.go:117] "RemoveContainer" containerID="dbc9e5064118b94004820a3281963a2849462487e095d45962d3e20b6bce00ee" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.711853 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz99s" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.734028 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hszqp" podStartSLOduration=3.347613302 podStartE2EDuration="6.734009471s" podCreationTimestamp="2026-02-25 07:31:51 +0000 UTC" firstStartedPulling="2026-02-25 07:31:53.677011158 +0000 UTC m=+867.038837178" lastFinishedPulling="2026-02-25 07:31:57.063407327 +0000 UTC m=+870.425233347" observedRunningTime="2026-02-25 07:31:57.728577304 +0000 UTC m=+871.090403324" watchObservedRunningTime="2026-02-25 07:31:57.734009471 +0000 UTC m=+871.095835481" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.747981 4749 scope.go:117] "RemoveContainer" containerID="e07bbb08a02fa24a03d39efe692caa3a8c4d46431fea2304b1d840b28866cf80" Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.749686 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.754538 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dz99s"] Feb 25 07:31:57 crc kubenswrapper[4749]: I0225 07:31:57.764345 4749 scope.go:117] "RemoveContainer" containerID="d23f6f423aa258462429b8dfa8b54bfd5bbd68f02170d95df059c97a53fd5944" Feb 25 07:31:59 crc kubenswrapper[4749]: I0225 07:31:59.328844 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" path="/var/lib/kubelet/pods/e4a4795b-0ec9-4d3b-924f-cea3796f4c85/volumes" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.137351 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533412-gwcm2"] Feb 25 07:32:00 crc kubenswrapper[4749]: E0225 07:32:00.137878 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="extract-utilities" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.137891 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="extract-utilities" Feb 25 07:32:00 crc kubenswrapper[4749]: E0225 07:32:00.137908 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="registry-server" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.137914 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="registry-server" Feb 25 07:32:00 crc kubenswrapper[4749]: E0225 07:32:00.137927 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="extract-content" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.137935 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="extract-content" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.138037 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a4795b-0ec9-4d3b-924f-cea3796f4c85" containerName="registry-server" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.138515 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.141475 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.141564 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.143571 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.147953 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533412-gwcm2"] Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.208199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qkfq\" (UniqueName: \"kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq\") pod \"auto-csr-approver-29533412-gwcm2\" (UID: \"a73ed139-bbb8-49c0-a53d-b09855ded2ea\") " pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.309635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qkfq\" (UniqueName: \"kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq\") pod \"auto-csr-approver-29533412-gwcm2\" (UID: \"a73ed139-bbb8-49c0-a53d-b09855ded2ea\") " pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.333363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qkfq\" (UniqueName: \"kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq\") pod \"auto-csr-approver-29533412-gwcm2\" (UID: \"a73ed139-bbb8-49c0-a53d-b09855ded2ea\") " 
pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.468221 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.802682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.858981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:32:00 crc kubenswrapper[4749]: I0225 07:32:00.864320 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533412-gwcm2"] Feb 25 07:32:00 crc kubenswrapper[4749]: W0225 07:32:00.870159 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73ed139_bbb8_49c0_a53d_b09855ded2ea.slice/crio-751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7 WatchSource:0}: Error finding container 751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7: Status 404 returned error can't find the container with id 751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7 Feb 25 07:32:01 crc kubenswrapper[4749]: I0225 07:32:01.737042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" event={"ID":"a73ed139-bbb8-49c0-a53d-b09855ded2ea","Type":"ContainerStarted","Data":"751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7"} Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.181194 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.181277 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.228906 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.371619 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-96dfd7b56-f9wq6" Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.746775 4749 generic.go:334] "Generic (PLEG): container finished" podID="a73ed139-bbb8-49c0-a53d-b09855ded2ea" containerID="468e4c7106d078844a2ce0ddaa61b3252f80439b2616458d61bcdfdc0fcae9dc" exitCode=0 Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.746870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" event={"ID":"a73ed139-bbb8-49c0-a53d-b09855ded2ea","Type":"ContainerDied","Data":"468e4c7106d078844a2ce0ddaa61b3252f80439b2616458d61bcdfdc0fcae9dc"} Feb 25 07:32:02 crc kubenswrapper[4749]: I0225 07:32:02.794860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.003423 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.160366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qkfq\" (UniqueName: \"kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq\") pod \"a73ed139-bbb8-49c0-a53d-b09855ded2ea\" (UID: \"a73ed139-bbb8-49c0-a53d-b09855ded2ea\") " Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.165104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq" (OuterVolumeSpecName: "kube-api-access-5qkfq") pod "a73ed139-bbb8-49c0-a53d-b09855ded2ea" (UID: "a73ed139-bbb8-49c0-a53d-b09855ded2ea"). InnerVolumeSpecName "kube-api-access-5qkfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.261310 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qkfq\" (UniqueName: \"kubernetes.io/projected/a73ed139-bbb8-49c0-a53d-b09855ded2ea-kube-api-access-5qkfq\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.760519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" event={"ID":"a73ed139-bbb8-49c0-a53d-b09855ded2ea","Type":"ContainerDied","Data":"751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7"} Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.760579 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751667d2d4844bbec1f62a0fdacdc3f1507ee3037725eaea459df74c46e183b7" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.760579 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533412-gwcm2" Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.792203 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:32:04 crc kubenswrapper[4749]: I0225 07:32:04.792562 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nb4tl" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="registry-server" containerID="cri-o://abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d" gracePeriod=2 Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.065113 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533406-mbmgt"] Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.068976 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533406-mbmgt"] Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.217881 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.329329 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a286dbe-25c1-4359-98d0-f79d66828da1" path="/var/lib/kubelet/pods/6a286dbe-25c1-4359-98d0-f79d66828da1/volumes" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.377800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content\") pod \"a9981155-4e11-4c94-a286-dcfc5972252b\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.378081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljd4n\" (UniqueName: \"kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n\") pod \"a9981155-4e11-4c94-a286-dcfc5972252b\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.378193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities\") pod \"a9981155-4e11-4c94-a286-dcfc5972252b\" (UID: \"a9981155-4e11-4c94-a286-dcfc5972252b\") " Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.379131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities" (OuterVolumeSpecName: "utilities") pod "a9981155-4e11-4c94-a286-dcfc5972252b" (UID: "a9981155-4e11-4c94-a286-dcfc5972252b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.385202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n" (OuterVolumeSpecName: "kube-api-access-ljd4n") pod "a9981155-4e11-4c94-a286-dcfc5972252b" (UID: "a9981155-4e11-4c94-a286-dcfc5972252b"). InnerVolumeSpecName "kube-api-access-ljd4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.479980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljd4n\" (UniqueName: \"kubernetes.io/projected/a9981155-4e11-4c94-a286-dcfc5972252b-kube-api-access-ljd4n\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.480015 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.548797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9981155-4e11-4c94-a286-dcfc5972252b" (UID: "a9981155-4e11-4c94-a286-dcfc5972252b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.581213 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9981155-4e11-4c94-a286-dcfc5972252b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.767483 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9981155-4e11-4c94-a286-dcfc5972252b" containerID="abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d" exitCode=0 Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.767531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerDied","Data":"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d"} Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.767562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb4tl" event={"ID":"a9981155-4e11-4c94-a286-dcfc5972252b","Type":"ContainerDied","Data":"0d6f6f85c9ed6f9fbee590c358facebe71f4f6e19c13ab0d7225654a887380da"} Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.767583 4749 scope.go:117] "RemoveContainer" containerID="abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.767768 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nb4tl" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.796513 4749 scope.go:117] "RemoveContainer" containerID="adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.796552 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.796909 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hszqp" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="registry-server" containerID="cri-o://ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996" gracePeriod=2 Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.810098 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.816164 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nb4tl"] Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.849446 4749 scope.go:117] "RemoveContainer" containerID="63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.877858 4749 scope.go:117] "RemoveContainer" containerID="abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d" Feb 25 07:32:05 crc kubenswrapper[4749]: E0225 07:32:05.878424 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d\": container with ID starting with abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d not found: ID does not exist" containerID="abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d" Feb 25 07:32:05 crc 
kubenswrapper[4749]: I0225 07:32:05.878463 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d"} err="failed to get container status \"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d\": rpc error: code = NotFound desc = could not find container \"abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d\": container with ID starting with abb391ae0eca6f299d517c7234d61b9c0f43d56069d0ca4a8bc5766c6fe5756d not found: ID does not exist" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.878490 4749 scope.go:117] "RemoveContainer" containerID="adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419" Feb 25 07:32:05 crc kubenswrapper[4749]: E0225 07:32:05.878788 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419\": container with ID starting with adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419 not found: ID does not exist" containerID="adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.878814 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419"} err="failed to get container status \"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419\": rpc error: code = NotFound desc = could not find container \"adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419\": container with ID starting with adfe990edc801b696b678f0a5b23e1a330c02a02b95a0d330f4ba213fc060419 not found: ID does not exist" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.878830 4749 scope.go:117] "RemoveContainer" containerID="63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a" Feb 25 
07:32:05 crc kubenswrapper[4749]: E0225 07:32:05.879241 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a\": container with ID starting with 63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a not found: ID does not exist" containerID="63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a" Feb 25 07:32:05 crc kubenswrapper[4749]: I0225 07:32:05.879292 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a"} err="failed to get container status \"63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a\": rpc error: code = NotFound desc = could not find container \"63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a\": container with ID starting with 63a99ec0cb6b4d9d0df55a2542c9f4901598f58216ff6b2f25eb1785e03bcd2a not found: ID does not exist" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.122847 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.290034 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t962\" (UniqueName: \"kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962\") pod \"5aea4381-c806-43f8-b557-58e02dc817ba\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.290094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities\") pod \"5aea4381-c806-43f8-b557-58e02dc817ba\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.290116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content\") pod \"5aea4381-c806-43f8-b557-58e02dc817ba\" (UID: \"5aea4381-c806-43f8-b557-58e02dc817ba\") " Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.302140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities" (OuterVolumeSpecName: "utilities") pod "5aea4381-c806-43f8-b557-58e02dc817ba" (UID: "5aea4381-c806-43f8-b557-58e02dc817ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.304935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962" (OuterVolumeSpecName: "kube-api-access-8t962") pod "5aea4381-c806-43f8-b557-58e02dc817ba" (UID: "5aea4381-c806-43f8-b557-58e02dc817ba"). InnerVolumeSpecName "kube-api-access-8t962". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.364464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5aea4381-c806-43f8-b557-58e02dc817ba" (UID: "5aea4381-c806-43f8-b557-58e02dc817ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.391724 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t962\" (UniqueName: \"kubernetes.io/projected/5aea4381-c806-43f8-b557-58e02dc817ba-kube-api-access-8t962\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.391767 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.391781 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5aea4381-c806-43f8-b557-58e02dc817ba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.777818 4749 generic.go:334] "Generic (PLEG): container finished" podID="5aea4381-c806-43f8-b557-58e02dc817ba" containerID="ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996" exitCode=0 Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.777891 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hszqp" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.777908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerDied","Data":"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996"} Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.778306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hszqp" event={"ID":"5aea4381-c806-43f8-b557-58e02dc817ba","Type":"ContainerDied","Data":"896f91e5efb1fd44778af576d289d4eca786b1a25bbb99bd46752772c8942980"} Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.778332 4749 scope.go:117] "RemoveContainer" containerID="ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.814563 4749 scope.go:117] "RemoveContainer" containerID="ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.820985 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.825211 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hszqp"] Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.833874 4749 scope.go:117] "RemoveContainer" containerID="74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.857589 4749 scope.go:117] "RemoveContainer" containerID="ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996" Feb 25 07:32:06 crc kubenswrapper[4749]: E0225 07:32:06.858136 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996\": container with ID starting with ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996 not found: ID does not exist" containerID="ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.858202 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996"} err="failed to get container status \"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996\": rpc error: code = NotFound desc = could not find container \"ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996\": container with ID starting with ae4e98da4120949a21458f83da8b66094393c842103d054c040a67a4a5290996 not found: ID does not exist" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.858245 4749 scope.go:117] "RemoveContainer" containerID="ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393" Feb 25 07:32:06 crc kubenswrapper[4749]: E0225 07:32:06.858729 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393\": container with ID starting with ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393 not found: ID does not exist" containerID="ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.858776 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393"} err="failed to get container status \"ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393\": rpc error: code = NotFound desc = could not find container \"ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393\": container with ID 
starting with ec52a84e834dde29182c6fe7a94cebd30fa78969b8eb9da69dac1c74abae6393 not found: ID does not exist" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.858824 4749 scope.go:117] "RemoveContainer" containerID="74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836" Feb 25 07:32:06 crc kubenswrapper[4749]: E0225 07:32:06.859275 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836\": container with ID starting with 74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836 not found: ID does not exist" containerID="74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836" Feb 25 07:32:06 crc kubenswrapper[4749]: I0225 07:32:06.859307 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836"} err="failed to get container status \"74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836\": rpc error: code = NotFound desc = could not find container \"74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836\": container with ID starting with 74be01bff1ad1fdf262db7121ad83fa7b47b81eafa19fb761c696f549c2d4836 not found: ID does not exist" Feb 25 07:32:07 crc kubenswrapper[4749]: I0225 07:32:07.336574 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" path="/var/lib/kubelet/pods/5aea4381-c806-43f8-b557-58e02dc817ba/volumes" Feb 25 07:32:07 crc kubenswrapper[4749]: I0225 07:32:07.338131 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" path="/var/lib/kubelet/pods/a9981155-4e11-4c94-a286-dcfc5972252b/volumes" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.074755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-56887f7db6-72j45" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.775720 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8cg2t"] Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776227 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776241 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="extract-utilities" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776262 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="extract-utilities" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776279 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="extract-content" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776284 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="extract-content" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776290 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="extract-utilities" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776296 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="extract-utilities" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776309 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73ed139-bbb8-49c0-a53d-b09855ded2ea" 
containerName="oc" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776317 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73ed139-bbb8-49c0-a53d-b09855ded2ea" containerName="oc" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776324 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776329 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: E0225 07:32:22.776339 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="extract-content" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776345 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="extract-content" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9981155-4e11-4c94-a286-dcfc5972252b" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776480 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73ed139-bbb8-49c0-a53d-b09855ded2ea" containerName="oc" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.776489 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aea4381-c806-43f8-b557-58e02dc817ba" containerName="registry-server" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.778410 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.781394 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6hrxh" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.781428 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.781560 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.793281 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s"] Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.799582 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.802072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.804311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s"] Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.876351 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-k7r4k"] Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.877145 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-k7r4k" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.882541 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.882570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-75xp6" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.882576 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.882541 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-startup\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919561 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-reloader\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cg5\" (UniqueName: \"kubernetes.io/projected/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-kube-api-access-z7cg5\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-sockets\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bkbb\" (UniqueName: \"kubernetes.io/projected/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-kube-api-access-6bkbb\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.919997 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-conf\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.926986 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-jz8zq"] Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.928371 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.933497 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 25 07:32:22 crc kubenswrapper[4749]: I0225 07:32:22.939522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-jz8zq"] Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metrics-certs\") pod 
\"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metallb-excludel2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-startup\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-reloader\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cg5\" (UniqueName: \"kubernetes.io/projected/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-kube-api-access-z7cg5\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021398 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 
07:32:23.021422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvvh\" (UniqueName: \"kubernetes.io/projected/9c8c2184-5984-4567-9de5-0141b3fc7fcd-kube-api-access-btvvh\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-sockets\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-cert\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5hw2\" (UniqueName: \"kubernetes.io/projected/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-kube-api-access-x5hw2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021503 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6bkbb\" (UniqueName: \"kubernetes.io/projected/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-kube-api-access-6bkbb\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-conf\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.021538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-metrics-certs\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.022051 4749 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.022780 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs podName:82f32e7a-8ffc-4429-bfd7-a12db75ed64a nodeName:}" failed. No retries permitted until 2026-02-25 07:32:23.522762394 +0000 UTC m=+896.884588414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs") pod "frr-k8s-8cg2t" (UID: "82f32e7a-8ffc-4429-bfd7-a12db75ed64a") : secret "frr-k8s-certs-secret" not found Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.022648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-sockets\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.022509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.022956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-conf\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.022726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-reloader\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.023329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-frr-startup\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 
07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.034378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.048706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bkbb\" (UniqueName: \"kubernetes.io/projected/3bb90660-64f8-43f6-b7c0-a4449f75c9fb-kube-api-access-6bkbb\") pod \"frr-k8s-webhook-server-78b44bf5bb-lh45s\" (UID: \"3bb90660-64f8-43f6-b7c0-a4449f75c9fb\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.058027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cg5\" (UniqueName: \"kubernetes.io/projected/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-kube-api-access-z7cg5\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.118561 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-metrics-certs\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metrics-certs\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metallb-excludel2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvvh\" (UniqueName: \"kubernetes.io/projected/9c8c2184-5984-4567-9de5-0141b3fc7fcd-kube-api-access-btvvh\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc 
kubenswrapper[4749]: I0225 07:32:23.122882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-cert\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.122984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5hw2\" (UniqueName: \"kubernetes.io/projected/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-kube-api-access-x5hw2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.123003 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.123230 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist podName:f9ece4c1-da53-4718-8a3a-aa9c0fd930da nodeName:}" failed. No retries permitted until 2026-02-25 07:32:23.623206353 +0000 UTC m=+896.985032383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist") pod "speaker-k7r4k" (UID: "f9ece4c1-da53-4718-8a3a-aa9c0fd930da") : secret "metallb-memberlist" not found Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.123040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metallb-excludel2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.126154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-cert\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.126315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c8c2184-5984-4567-9de5-0141b3fc7fcd-metrics-certs\") pod \"controller-69bbfbf88f-jz8zq\" (UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.126564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-metrics-certs\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.139839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvvh\" (UniqueName: \"kubernetes.io/projected/9c8c2184-5984-4567-9de5-0141b3fc7fcd-kube-api-access-btvvh\") pod \"controller-69bbfbf88f-jz8zq\" 
(UID: \"9c8c2184-5984-4567-9de5-0141b3fc7fcd\") " pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.144640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5hw2\" (UniqueName: \"kubernetes.io/projected/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-kube-api-access-x5hw2\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.259071 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.314953 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s"] Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.527976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.532322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82f32e7a-8ffc-4429-bfd7-a12db75ed64a-metrics-certs\") pod \"frr-k8s-8cg2t\" (UID: \"82f32e7a-8ffc-4429-bfd7-a12db75ed64a\") " pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.629830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.629977 4749 secret.go:188] Couldn't get 
secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 25 07:32:23 crc kubenswrapper[4749]: E0225 07:32:23.630040 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist podName:f9ece4c1-da53-4718-8a3a-aa9c0fd930da nodeName:}" failed. No retries permitted until 2026-02-25 07:32:24.630022654 +0000 UTC m=+897.991848664 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist") pod "speaker-k7r4k" (UID: "f9ece4c1-da53-4718-8a3a-aa9c0fd930da") : secret "metallb-memberlist" not found Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.655228 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-jz8zq"] Feb 25 07:32:23 crc kubenswrapper[4749]: W0225 07:32:23.665548 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8c2184_5984_4567_9de5_0141b3fc7fcd.slice/crio-9d01862c91f54dc5bf9d3d160e9046b3b3035bbd9b47a164cc52d53ad090a9a6 WatchSource:0}: Error finding container 9d01862c91f54dc5bf9d3d160e9046b3b3035bbd9b47a164cc52d53ad090a9a6: Status 404 returned error can't find the container with id 9d01862c91f54dc5bf9d3d160e9046b3b3035bbd9b47a164cc52d53ad090a9a6 Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.694956 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.904075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" event={"ID":"3bb90660-64f8-43f6-b7c0-a4449f75c9fb","Type":"ContainerStarted","Data":"650c13a79edd8232aaa676079975f80b7a55a21b2ceb33d74742571fe3d85062"} Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.907025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jz8zq" event={"ID":"9c8c2184-5984-4567-9de5-0141b3fc7fcd","Type":"ContainerStarted","Data":"2e5fa383c126f1586c994895956d9be45368d0f575644483e1a345e2f32247f4"} Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.907081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jz8zq" event={"ID":"9c8c2184-5984-4567-9de5-0141b3fc7fcd","Type":"ContainerStarted","Data":"9d01862c91f54dc5bf9d3d160e9046b3b3035bbd9b47a164cc52d53ad090a9a6"} Feb 25 07:32:23 crc kubenswrapper[4749]: I0225 07:32:23.908469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"bebe59b6da5f58427d73a36be330f747018bca35671515ebd31f30a2eb648351"} Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.642828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " pod="metallb-system/speaker-k7r4k" Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.648038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f9ece4c1-da53-4718-8a3a-aa9c0fd930da-memberlist\") pod \"speaker-k7r4k\" (UID: \"f9ece4c1-da53-4718-8a3a-aa9c0fd930da\") " 
pod="metallb-system/speaker-k7r4k" Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.690887 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-k7r4k" Feb 25 07:32:24 crc kubenswrapper[4749]: W0225 07:32:24.727229 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ece4c1_da53_4718_8a3a_aa9c0fd930da.slice/crio-77c7cfa4c8185c5b33f71d582183a78dbf5df521db2902526d687afb63c08146 WatchSource:0}: Error finding container 77c7cfa4c8185c5b33f71d582183a78dbf5df521db2902526d687afb63c08146: Status 404 returned error can't find the container with id 77c7cfa4c8185c5b33f71d582183a78dbf5df521db2902526d687afb63c08146 Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.928163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k7r4k" event={"ID":"f9ece4c1-da53-4718-8a3a-aa9c0fd930da","Type":"ContainerStarted","Data":"77c7cfa4c8185c5b33f71d582183a78dbf5df521db2902526d687afb63c08146"} Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.929836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-jz8zq" event={"ID":"9c8c2184-5984-4567-9de5-0141b3fc7fcd","Type":"ContainerStarted","Data":"d9e97cbd4cdbcc982e3f991ae0051d78994658a8a49e81745a4699565e666316"} Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.929966 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:24 crc kubenswrapper[4749]: I0225 07:32:24.948971 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-jz8zq" podStartSLOduration=2.948952394 podStartE2EDuration="2.948952394s" podCreationTimestamp="2026-02-25 07:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 
07:32:24.943938098 +0000 UTC m=+898.305764118" watchObservedRunningTime="2026-02-25 07:32:24.948952394 +0000 UTC m=+898.310778424" Feb 25 07:32:25 crc kubenswrapper[4749]: I0225 07:32:25.940315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k7r4k" event={"ID":"f9ece4c1-da53-4718-8a3a-aa9c0fd930da","Type":"ContainerStarted","Data":"3a2fa0d108e8c39fbf3dfcc47ab50a531591cd59140d5da4611bad7f13b3d3e2"} Feb 25 07:32:25 crc kubenswrapper[4749]: I0225 07:32:25.940713 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-k7r4k" Feb 25 07:32:25 crc kubenswrapper[4749]: I0225 07:32:25.940728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-k7r4k" event={"ID":"f9ece4c1-da53-4718-8a3a-aa9c0fd930da","Type":"ContainerStarted","Data":"7dba11cd7e489024902a17951a7627d702437ec49ff8ab55619d9bdb8c93acc1"} Feb 25 07:32:25 crc kubenswrapper[4749]: I0225 07:32:25.959847 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-k7r4k" podStartSLOduration=3.959827101 podStartE2EDuration="3.959827101s" podCreationTimestamp="2026-02-25 07:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:32:25.955551091 +0000 UTC m=+899.317377111" watchObservedRunningTime="2026-02-25 07:32:25.959827101 +0000 UTC m=+899.321653121" Feb 25 07:32:30 crc kubenswrapper[4749]: I0225 07:32:30.982442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" event={"ID":"3bb90660-64f8-43f6-b7c0-a4449f75c9fb","Type":"ContainerStarted","Data":"7146b84c0cd748ab21eb42bb586fa3b71646d5472c15c20b86d679198dc74036"} Feb 25 07:32:30 crc kubenswrapper[4749]: I0225 07:32:30.983080 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 
07:32:30 crc kubenswrapper[4749]: I0225 07:32:30.986843 4749 generic.go:334] "Generic (PLEG): container finished" podID="82f32e7a-8ffc-4429-bfd7-a12db75ed64a" containerID="08a023b7a1656c1e5e44f1bb6d93b366181ea47fa345b8217ba6dc6dc1d5c05d" exitCode=0 Feb 25 07:32:30 crc kubenswrapper[4749]: I0225 07:32:30.986905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerDied","Data":"08a023b7a1656c1e5e44f1bb6d93b366181ea47fa345b8217ba6dc6dc1d5c05d"} Feb 25 07:32:31 crc kubenswrapper[4749]: I0225 07:32:31.001758 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" podStartSLOduration=2.196067085 podStartE2EDuration="9.001739688s" podCreationTimestamp="2026-02-25 07:32:22 +0000 UTC" firstStartedPulling="2026-02-25 07:32:23.321100672 +0000 UTC m=+896.682926692" lastFinishedPulling="2026-02-25 07:32:30.126773275 +0000 UTC m=+903.488599295" observedRunningTime="2026-02-25 07:32:31.001262657 +0000 UTC m=+904.363088747" watchObservedRunningTime="2026-02-25 07:32:31.001739688 +0000 UTC m=+904.363565718" Feb 25 07:32:31 crc kubenswrapper[4749]: I0225 07:32:31.999072 4749 generic.go:334] "Generic (PLEG): container finished" podID="82f32e7a-8ffc-4429-bfd7-a12db75ed64a" containerID="fc47077603d0f2bc056e8b3f77d856e1f16fcdcfd9f0c5cd286f75e13001f9ff" exitCode=0 Feb 25 07:32:31 crc kubenswrapper[4749]: I0225 07:32:31.999178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerDied","Data":"fc47077603d0f2bc056e8b3f77d856e1f16fcdcfd9f0c5cd286f75e13001f9ff"} Feb 25 07:32:33 crc kubenswrapper[4749]: I0225 07:32:33.009139 4749 generic.go:334] "Generic (PLEG): container finished" podID="82f32e7a-8ffc-4429-bfd7-a12db75ed64a" containerID="4b1a797ab2cf177405131ef2af38d268a20873f3687af8de594327f943d1e1a9" 
exitCode=0 Feb 25 07:32:33 crc kubenswrapper[4749]: I0225 07:32:33.009229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerDied","Data":"4b1a797ab2cf177405131ef2af38d268a20873f3687af8de594327f943d1e1a9"} Feb 25 07:32:33 crc kubenswrapper[4749]: I0225 07:32:33.264581 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-jz8zq" Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.024078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"8160e4457ba789735dfac2831096dc26861246e3807e06577effe59cbbf35da0"} Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.024395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"444c0c2e4219937462c1ceedad5b2225093b12b1135a676b5c3c319f7abd4a66"} Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.024408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"7ea6273c1cc2ec450d48ae142d391a901d8fe0ff10705412cd7bfcf2f78a4996"} Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.024419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"c285daeeb3b20af6a0a9184de94cbb99242578e4c122984b6f89e18efc4452d2"} Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.024428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" 
event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"504b07a8e9c08a29ccb6069b4d9319d14ae2b5a5f78a6fd6e1e95b398ed17c19"} Feb 25 07:32:34 crc kubenswrapper[4749]: I0225 07:32:34.551105 4749 scope.go:117] "RemoveContainer" containerID="76b5984e3533edb5fec5a64059ed8fcbdb292796c2c8ee1eaebe159be241260d" Feb 25 07:32:35 crc kubenswrapper[4749]: I0225 07:32:35.049834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8cg2t" event={"ID":"82f32e7a-8ffc-4429-bfd7-a12db75ed64a","Type":"ContainerStarted","Data":"8d8d93fc350679731e9cef0c3c1dd9243ca4d7163d9bf7f2c588ae7219de0003"} Feb 25 07:32:35 crc kubenswrapper[4749]: I0225 07:32:35.052050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:35 crc kubenswrapper[4749]: I0225 07:32:35.103940 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8cg2t" podStartSLOduration=6.794483385 podStartE2EDuration="13.103906864s" podCreationTimestamp="2026-02-25 07:32:22 +0000 UTC" firstStartedPulling="2026-02-25 07:32:23.834094996 +0000 UTC m=+897.195921056" lastFinishedPulling="2026-02-25 07:32:30.143518515 +0000 UTC m=+903.505344535" observedRunningTime="2026-02-25 07:32:35.085471615 +0000 UTC m=+908.447297665" watchObservedRunningTime="2026-02-25 07:32:35.103906864 +0000 UTC m=+908.465732924" Feb 25 07:32:38 crc kubenswrapper[4749]: I0225 07:32:38.695955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:38 crc kubenswrapper[4749]: I0225 07:32:38.746610 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:43 crc kubenswrapper[4749]: I0225 07:32:43.126014 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-lh45s" Feb 25 07:32:43 crc 
kubenswrapper[4749]: I0225 07:32:43.699486 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8cg2t" Feb 25 07:32:44 crc kubenswrapper[4749]: I0225 07:32:44.694058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-k7r4k" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.672921 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.679037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.684547 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.685237 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.685459 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vpj8s" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.688439 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.800829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqshl\" (UniqueName: \"kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl\") pod \"openstack-operator-index-mxxvr\" (UID: \"40c9b07f-4b8d-4651-b561-2f7355e6d991\") " pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.901817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jqshl\" (UniqueName: \"kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl\") pod \"openstack-operator-index-mxxvr\" (UID: \"40c9b07f-4b8d-4651-b561-2f7355e6d991\") " pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:47 crc kubenswrapper[4749]: I0225 07:32:47.922040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqshl\" (UniqueName: \"kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl\") pod \"openstack-operator-index-mxxvr\" (UID: \"40c9b07f-4b8d-4651-b561-2f7355e6d991\") " pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:48 crc kubenswrapper[4749]: I0225 07:32:48.004645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:48 crc kubenswrapper[4749]: I0225 07:32:48.478294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:49 crc kubenswrapper[4749]: I0225 07:32:49.169903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mxxvr" event={"ID":"40c9b07f-4b8d-4651-b561-2f7355e6d991","Type":"ContainerStarted","Data":"d2e23eabc4ee7abc89c4467e64661702273424613ac9c70fa6d432c9e1b3c842"} Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.015543 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.629290 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tk29m"] Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.630099 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.638301 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tk29m"] Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.762316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xqr\" (UniqueName: \"kubernetes.io/projected/7b2662ca-97d0-4f81-b90a-c2735bd2a62a-kube-api-access-w5xqr\") pod \"openstack-operator-index-tk29m\" (UID: \"7b2662ca-97d0-4f81-b90a-c2735bd2a62a\") " pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.863775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xqr\" (UniqueName: \"kubernetes.io/projected/7b2662ca-97d0-4f81-b90a-c2735bd2a62a-kube-api-access-w5xqr\") pod \"openstack-operator-index-tk29m\" (UID: \"7b2662ca-97d0-4f81-b90a-c2735bd2a62a\") " pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.899298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xqr\" (UniqueName: \"kubernetes.io/projected/7b2662ca-97d0-4f81-b90a-c2735bd2a62a-kube-api-access-w5xqr\") pod \"openstack-operator-index-tk29m\" (UID: \"7b2662ca-97d0-4f81-b90a-c2735bd2a62a\") " pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:32:51 crc kubenswrapper[4749]: I0225 07:32:51.997470 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:32:52 crc kubenswrapper[4749]: I0225 07:32:52.191004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mxxvr" event={"ID":"40c9b07f-4b8d-4651-b561-2f7355e6d991","Type":"ContainerStarted","Data":"41e9100b1bfb56cfe9665422bbeb4b85d7d45e7cd3b57ce8f88e8ccd8569584a"} Feb 25 07:32:52 crc kubenswrapper[4749]: I0225 07:32:52.191168 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mxxvr" podUID="40c9b07f-4b8d-4651-b561-2f7355e6d991" containerName="registry-server" containerID="cri-o://41e9100b1bfb56cfe9665422bbeb4b85d7d45e7cd3b57ce8f88e8ccd8569584a" gracePeriod=2 Feb 25 07:32:52 crc kubenswrapper[4749]: I0225 07:32:52.234510 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mxxvr" podStartSLOduration=1.9245404449999999 podStartE2EDuration="5.234479524s" podCreationTimestamp="2026-02-25 07:32:47 +0000 UTC" firstStartedPulling="2026-02-25 07:32:48.490250054 +0000 UTC m=+921.852076074" lastFinishedPulling="2026-02-25 07:32:51.800189123 +0000 UTC m=+925.162015153" observedRunningTime="2026-02-25 07:32:52.211920919 +0000 UTC m=+925.573746949" watchObservedRunningTime="2026-02-25 07:32:52.234479524 +0000 UTC m=+925.596305574" Feb 25 07:32:52 crc kubenswrapper[4749]: I0225 07:32:52.240484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tk29m"] Feb 25 07:32:52 crc kubenswrapper[4749]: W0225 07:32:52.251729 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2662ca_97d0_4f81_b90a_c2735bd2a62a.slice/crio-d06633dce1bf6143f3bffce4adbcdc13e2a582076bfe25aa26fc8cd21ce65504 WatchSource:0}: Error finding container 
d06633dce1bf6143f3bffce4adbcdc13e2a582076bfe25aa26fc8cd21ce65504: Status 404 returned error can't find the container with id d06633dce1bf6143f3bffce4adbcdc13e2a582076bfe25aa26fc8cd21ce65504 Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.222801 4749 generic.go:334] "Generic (PLEG): container finished" podID="40c9b07f-4b8d-4651-b561-2f7355e6d991" containerID="41e9100b1bfb56cfe9665422bbeb4b85d7d45e7cd3b57ce8f88e8ccd8569584a" exitCode=0 Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.222899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mxxvr" event={"ID":"40c9b07f-4b8d-4651-b561-2f7355e6d991","Type":"ContainerDied","Data":"41e9100b1bfb56cfe9665422bbeb4b85d7d45e7cd3b57ce8f88e8ccd8569584a"} Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.231328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tk29m" event={"ID":"7b2662ca-97d0-4f81-b90a-c2735bd2a62a","Type":"ContainerStarted","Data":"d06633dce1bf6143f3bffce4adbcdc13e2a582076bfe25aa26fc8cd21ce65504"} Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.430966 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.593376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqshl\" (UniqueName: \"kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl\") pod \"40c9b07f-4b8d-4651-b561-2f7355e6d991\" (UID: \"40c9b07f-4b8d-4651-b561-2f7355e6d991\") " Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.599769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl" (OuterVolumeSpecName: "kube-api-access-jqshl") pod "40c9b07f-4b8d-4651-b561-2f7355e6d991" (UID: "40c9b07f-4b8d-4651-b561-2f7355e6d991"). InnerVolumeSpecName "kube-api-access-jqshl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:32:53 crc kubenswrapper[4749]: I0225 07:32:53.695094 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqshl\" (UniqueName: \"kubernetes.io/projected/40c9b07f-4b8d-4651-b561-2f7355e6d991-kube-api-access-jqshl\") on node \"crc\" DevicePath \"\"" Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.239812 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mxxvr" Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.239842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mxxvr" event={"ID":"40c9b07f-4b8d-4651-b561-2f7355e6d991","Type":"ContainerDied","Data":"d2e23eabc4ee7abc89c4467e64661702273424613ac9c70fa6d432c9e1b3c842"} Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.240236 4749 scope.go:117] "RemoveContainer" containerID="41e9100b1bfb56cfe9665422bbeb4b85d7d45e7cd3b57ce8f88e8ccd8569584a" Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.242981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tk29m" event={"ID":"7b2662ca-97d0-4f81-b90a-c2735bd2a62a","Type":"ContainerStarted","Data":"3ccaa6514dc0c83d96088344639b94674a1fca3a639a0e3749a9e48cd15d2d27"} Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.266798 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tk29m" podStartSLOduration=1.9964960170000001 podStartE2EDuration="3.266766884s" podCreationTimestamp="2026-02-25 07:32:51 +0000 UTC" firstStartedPulling="2026-02-25 07:32:52.256691422 +0000 UTC m=+925.618517452" lastFinishedPulling="2026-02-25 07:32:53.526962279 +0000 UTC m=+926.888788319" observedRunningTime="2026-02-25 07:32:54.266193231 +0000 UTC m=+927.628019291" watchObservedRunningTime="2026-02-25 07:32:54.266766884 +0000 UTC m=+927.628592944" Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.290937 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:54 crc kubenswrapper[4749]: I0225 07:32:54.298918 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mxxvr"] Feb 25 07:32:55 crc kubenswrapper[4749]: I0225 07:32:55.339663 4749 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="40c9b07f-4b8d-4651-b561-2f7355e6d991" path="/var/lib/kubelet/pods/40c9b07f-4b8d-4651-b561-2f7355e6d991/volumes" Feb 25 07:33:01 crc kubenswrapper[4749]: I0225 07:33:01.998142 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:33:01 crc kubenswrapper[4749]: I0225 07:33:01.998707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:33:02 crc kubenswrapper[4749]: I0225 07:33:02.043537 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:33:02 crc kubenswrapper[4749]: I0225 07:33:02.407971 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tk29m" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.494763 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t"] Feb 25 07:33:08 crc kubenswrapper[4749]: E0225 07:33:08.495349 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c9b07f-4b8d-4651-b561-2f7355e6d991" containerName="registry-server" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.495362 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c9b07f-4b8d-4651-b561-2f7355e6d991" containerName="registry-server" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.495511 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c9b07f-4b8d-4651-b561-2f7355e6d991" containerName="registry-server" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.496465 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.502489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-g5648" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.512058 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t"] Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.629008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.629085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsggz\" (UniqueName: \"kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.629174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 
07:33:08.731101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.731505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.731629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsggz\" (UniqueName: \"kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.731745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.732483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.758409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsggz\" (UniqueName: \"kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz\") pod \"0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:08 crc kubenswrapper[4749]: I0225 07:33:08.821206 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:09 crc kubenswrapper[4749]: I0225 07:33:09.037146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t"] Feb 25 07:33:09 crc kubenswrapper[4749]: I0225 07:33:09.361444 4749 generic.go:334] "Generic (PLEG): container finished" podID="08be220c-12a9-4f49-b123-7a1413328415" containerID="ecb17c558e1d39cb566391f408e6134a2900487900a1f3e83729f4070c3c734e" exitCode=0 Feb 25 07:33:09 crc kubenswrapper[4749]: I0225 07:33:09.361585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" event={"ID":"08be220c-12a9-4f49-b123-7a1413328415","Type":"ContainerDied","Data":"ecb17c558e1d39cb566391f408e6134a2900487900a1f3e83729f4070c3c734e"} Feb 25 07:33:09 crc kubenswrapper[4749]: I0225 07:33:09.361645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" event={"ID":"08be220c-12a9-4f49-b123-7a1413328415","Type":"ContainerStarted","Data":"68b5ba35ca3ce1eeb3efee3360a4761ef2cc4d680edf19974992c06362ab81b0"} Feb 25 07:33:10 crc kubenswrapper[4749]: I0225 07:33:10.374447 4749 generic.go:334] "Generic (PLEG): container finished" podID="08be220c-12a9-4f49-b123-7a1413328415" containerID="240f47280554260038c5cbcd5a23b5c2b2c699675b2e026bebdd8a10bc585a3d" exitCode=0 Feb 25 07:33:10 crc kubenswrapper[4749]: I0225 07:33:10.374588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" event={"ID":"08be220c-12a9-4f49-b123-7a1413328415","Type":"ContainerDied","Data":"240f47280554260038c5cbcd5a23b5c2b2c699675b2e026bebdd8a10bc585a3d"} Feb 25 07:33:11 crc kubenswrapper[4749]: I0225 07:33:11.388272 4749 generic.go:334] "Generic (PLEG): container finished" podID="08be220c-12a9-4f49-b123-7a1413328415" containerID="ad71a82cea87576614d73c4e6870cb1f41cc51057624962c5b057ddea568fae3" exitCode=0 Feb 25 07:33:11 crc kubenswrapper[4749]: I0225 07:33:11.388373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" event={"ID":"08be220c-12a9-4f49-b123-7a1413328415","Type":"ContainerDied","Data":"ad71a82cea87576614d73c4e6870cb1f41cc51057624962c5b057ddea568fae3"} Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.714675 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.795687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsggz\" (UniqueName: \"kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz\") pod \"08be220c-12a9-4f49-b123-7a1413328415\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.796161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util\") pod \"08be220c-12a9-4f49-b123-7a1413328415\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.796218 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle\") pod \"08be220c-12a9-4f49-b123-7a1413328415\" (UID: \"08be220c-12a9-4f49-b123-7a1413328415\") " Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.797984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle" (OuterVolumeSpecName: "bundle") pod "08be220c-12a9-4f49-b123-7a1413328415" (UID: "08be220c-12a9-4f49-b123-7a1413328415"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.806788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz" (OuterVolumeSpecName: "kube-api-access-vsggz") pod "08be220c-12a9-4f49-b123-7a1413328415" (UID: "08be220c-12a9-4f49-b123-7a1413328415"). InnerVolumeSpecName "kube-api-access-vsggz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.825076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util" (OuterVolumeSpecName: "util") pod "08be220c-12a9-4f49-b123-7a1413328415" (UID: "08be220c-12a9-4f49-b123-7a1413328415"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.897976 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsggz\" (UniqueName: \"kubernetes.io/projected/08be220c-12a9-4f49-b123-7a1413328415-kube-api-access-vsggz\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.898026 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-util\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:12 crc kubenswrapper[4749]: I0225 07:33:12.898045 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08be220c-12a9-4f49-b123-7a1413328415-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:13 crc kubenswrapper[4749]: I0225 07:33:13.407125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" event={"ID":"08be220c-12a9-4f49-b123-7a1413328415","Type":"ContainerDied","Data":"68b5ba35ca3ce1eeb3efee3360a4761ef2cc4d680edf19974992c06362ab81b0"} Feb 25 07:33:13 crc kubenswrapper[4749]: I0225 07:33:13.407352 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b5ba35ca3ce1eeb3efee3360a4761ef2cc4d680edf19974992c06362ab81b0" Feb 25 07:33:13 crc kubenswrapper[4749]: I0225 07:33:13.407203 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.681121 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh"] Feb 25 07:33:20 crc kubenswrapper[4749]: E0225 07:33:20.681854 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="pull" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.681865 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="pull" Feb 25 07:33:20 crc kubenswrapper[4749]: E0225 07:33:20.681875 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="extract" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.681881 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="extract" Feb 25 07:33:20 crc kubenswrapper[4749]: E0225 07:33:20.681890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="util" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.681895 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="util" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.681988 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="08be220c-12a9-4f49-b123-7a1413328415" containerName="extract" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.682357 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.685919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pb6f2" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.719140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh"] Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.816182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjg57\" (UniqueName: \"kubernetes.io/projected/3effb281-c000-4831-89a1-8b85dfc219b3-kube-api-access-cjg57\") pod \"openstack-operator-controller-init-57b9579b7-7f7fh\" (UID: \"3effb281-c000-4831-89a1-8b85dfc219b3\") " pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.916972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjg57\" (UniqueName: \"kubernetes.io/projected/3effb281-c000-4831-89a1-8b85dfc219b3-kube-api-access-cjg57\") pod \"openstack-operator-controller-init-57b9579b7-7f7fh\" (UID: \"3effb281-c000-4831-89a1-8b85dfc219b3\") " pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:20 crc kubenswrapper[4749]: I0225 07:33:20.944744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjg57\" (UniqueName: \"kubernetes.io/projected/3effb281-c000-4831-89a1-8b85dfc219b3-kube-api-access-cjg57\") pod \"openstack-operator-controller-init-57b9579b7-7f7fh\" (UID: \"3effb281-c000-4831-89a1-8b85dfc219b3\") " pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:21 crc kubenswrapper[4749]: I0225 07:33:21.005943 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:21 crc kubenswrapper[4749]: I0225 07:33:21.236651 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh"] Feb 25 07:33:21 crc kubenswrapper[4749]: I0225 07:33:21.466200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" event={"ID":"3effb281-c000-4831-89a1-8b85dfc219b3","Type":"ContainerStarted","Data":"c361c6bb835da07b9c871dbc53412c55b743065a6ccb64397c1253a5b50ab785"} Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.004790 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.005995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.010058 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.063122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.063175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 
25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.063210 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44clh\" (UniqueName: \"kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.164965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.165013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.165043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44clh\" (UniqueName: \"kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.166637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 
25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.166709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.185206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44clh\" (UniqueName: \"kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh\") pod \"redhat-marketplace-47zwl\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:23 crc kubenswrapper[4749]: I0225 07:33:23.333505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.206898 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:25 crc kubenswrapper[4749]: W0225 07:33:25.218639 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb4ef0a_4209_4f48_bcec_d81d19ef3db5.slice/crio-2673d7f9f909a384e31c0f55342817b41bdf88f60defac6ead5c42574eb37402 WatchSource:0}: Error finding container 2673d7f9f909a384e31c0f55342817b41bdf88f60defac6ead5c42574eb37402: Status 404 returned error can't find the container with id 2673d7f9f909a384e31c0f55342817b41bdf88f60defac6ead5c42574eb37402 Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.496043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" 
event={"ID":"3effb281-c000-4831-89a1-8b85dfc219b3","Type":"ContainerStarted","Data":"a58be4cab88357bf8356b1c4ac6361435e0c9479fe300f121b9f3fb04a2b567d"} Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.496158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.498284 4749 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerID="3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d" exitCode=0 Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.498323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerDied","Data":"3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d"} Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.498346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerStarted","Data":"2673d7f9f909a384e31c0f55342817b41bdf88f60defac6ead5c42574eb37402"} Feb 25 07:33:25 crc kubenswrapper[4749]: I0225 07:33:25.546305 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" podStartSLOduration=1.8939993849999999 podStartE2EDuration="5.546287698s" podCreationTimestamp="2026-02-25 07:33:20 +0000 UTC" firstStartedPulling="2026-02-25 07:33:21.246776691 +0000 UTC m=+954.608602711" lastFinishedPulling="2026-02-25 07:33:24.899065004 +0000 UTC m=+958.260891024" observedRunningTime="2026-02-25 07:33:25.542417724 +0000 UTC m=+958.904243754" watchObservedRunningTime="2026-02-25 07:33:25.546287698 +0000 UTC m=+958.908113718" Feb 25 07:33:26 crc kubenswrapper[4749]: I0225 07:33:26.513779 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerStarted","Data":"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da"} Feb 25 07:33:27 crc kubenswrapper[4749]: I0225 07:33:27.525810 4749 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerID="ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da" exitCode=0 Feb 25 07:33:27 crc kubenswrapper[4749]: I0225 07:33:27.525865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerDied","Data":"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da"} Feb 25 07:33:28 crc kubenswrapper[4749]: I0225 07:33:28.537324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerStarted","Data":"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a"} Feb 25 07:33:28 crc kubenswrapper[4749]: I0225 07:33:28.561487 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-47zwl" podStartSLOduration=4.122511077 podStartE2EDuration="6.561460301s" podCreationTimestamp="2026-02-25 07:33:22 +0000 UTC" firstStartedPulling="2026-02-25 07:33:25.501159088 +0000 UTC m=+958.862985108" lastFinishedPulling="2026-02-25 07:33:27.940108292 +0000 UTC m=+961.301934332" observedRunningTime="2026-02-25 07:33:28.557931247 +0000 UTC m=+961.919757297" watchObservedRunningTime="2026-02-25 07:33:28.561460301 +0000 UTC m=+961.923286341" Feb 25 07:33:31 crc kubenswrapper[4749]: I0225 07:33:31.008649 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-57b9579b7-7f7fh" 
Feb 25 07:33:33 crc kubenswrapper[4749]: I0225 07:33:33.336706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:33 crc kubenswrapper[4749]: I0225 07:33:33.337186 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:33 crc kubenswrapper[4749]: I0225 07:33:33.422586 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:33 crc kubenswrapper[4749]: I0225 07:33:33.607512 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:35 crc kubenswrapper[4749]: I0225 07:33:35.784009 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:35 crc kubenswrapper[4749]: I0225 07:33:35.784745 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-47zwl" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="registry-server" containerID="cri-o://72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a" gracePeriod=2 Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.226718 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.362087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44clh\" (UniqueName: \"kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh\") pod \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.362410 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities\") pod \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.363099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content\") pod \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\" (UID: \"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5\") " Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.363833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities" (OuterVolumeSpecName: "utilities") pod "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" (UID: "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.368129 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh" (OuterVolumeSpecName: "kube-api-access-44clh") pod "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" (UID: "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5"). InnerVolumeSpecName "kube-api-access-44clh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.392669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" (UID: "eeb4ef0a-4209-4f48-bcec-d81d19ef3db5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.465444 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.465847 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.465871 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44clh\" (UniqueName: \"kubernetes.io/projected/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5-kube-api-access-44clh\") on node \"crc\" DevicePath \"\"" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.592500 4749 generic.go:334] "Generic (PLEG): container finished" podID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerID="72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a" exitCode=0 Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.592538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerDied","Data":"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a"} Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.592564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-47zwl" event={"ID":"eeb4ef0a-4209-4f48-bcec-d81d19ef3db5","Type":"ContainerDied","Data":"2673d7f9f909a384e31c0f55342817b41bdf88f60defac6ead5c42574eb37402"} Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.592565 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47zwl" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.592579 4749 scope.go:117] "RemoveContainer" containerID="72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.608813 4749 scope.go:117] "RemoveContainer" containerID="ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.621141 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.625230 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-47zwl"] Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.642675 4749 scope.go:117] "RemoveContainer" containerID="3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.655060 4749 scope.go:117] "RemoveContainer" containerID="72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a" Feb 25 07:33:36 crc kubenswrapper[4749]: E0225 07:33:36.655484 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a\": container with ID starting with 72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a not found: ID does not exist" containerID="72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.655539 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a"} err="failed to get container status \"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a\": rpc error: code = NotFound desc = could not find container \"72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a\": container with ID starting with 72e537b4fef651cb415e5894277fce9f10d842f851e7e3ef880da7a0fed98c4a not found: ID does not exist" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.655569 4749 scope.go:117] "RemoveContainer" containerID="ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da" Feb 25 07:33:36 crc kubenswrapper[4749]: E0225 07:33:36.656071 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da\": container with ID starting with ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da not found: ID does not exist" containerID="ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.656103 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da"} err="failed to get container status \"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da\": rpc error: code = NotFound desc = could not find container \"ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da\": container with ID starting with ba5f12573a883c954ed626858948bfd4503dca541426416f89ec18276eb836da not found: ID does not exist" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.656123 4749 scope.go:117] "RemoveContainer" containerID="3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d" Feb 25 07:33:36 crc kubenswrapper[4749]: E0225 
07:33:36.656322 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d\": container with ID starting with 3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d not found: ID does not exist" containerID="3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d" Feb 25 07:33:36 crc kubenswrapper[4749]: I0225 07:33:36.656357 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d"} err="failed to get container status \"3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d\": rpc error: code = NotFound desc = could not find container \"3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d\": container with ID starting with 3aeabb1db0266239ce334f80d5ac16e1667ecda9b2fe15f94c58a72b30995a8d not found: ID does not exist" Feb 25 07:33:37 crc kubenswrapper[4749]: I0225 07:33:37.335377 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" path="/var/lib/kubelet/pods/eeb4ef0a-4209-4f48-bcec-d81d19ef3db5/volumes" Feb 25 07:33:51 crc kubenswrapper[4749]: I0225 07:33:51.672040 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:33:51 crc kubenswrapper[4749]: I0225 07:33:51.672806 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.153538 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533414-x578f"] Feb 25 07:34:00 crc kubenswrapper[4749]: E0225 07:34:00.154390 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="extract-utilities" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.154405 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="extract-utilities" Feb 25 07:34:00 crc kubenswrapper[4749]: E0225 07:34:00.154429 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="extract-content" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.154438 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="extract-content" Feb 25 07:34:00 crc kubenswrapper[4749]: E0225 07:34:00.154451 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="registry-server" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.154459 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="registry-server" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.154612 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb4ef0a-4209-4f48-bcec-d81d19ef3db5" containerName="registry-server" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.155076 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.161791 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.161846 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.162149 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.168574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533414-x578f"] Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.194935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-467tr\" (UniqueName: \"kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr\") pod \"auto-csr-approver-29533414-x578f\" (UID: \"d3089009-af37-4a9f-b86c-64282651c575\") " pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.296303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-467tr\" (UniqueName: \"kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr\") pod \"auto-csr-approver-29533414-x578f\" (UID: \"d3089009-af37-4a9f-b86c-64282651c575\") " pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.313418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-467tr\" (UniqueName: \"kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr\") pod \"auto-csr-approver-29533414-x578f\" (UID: \"d3089009-af37-4a9f-b86c-64282651c575\") " 
pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.476009 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.751352 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533414-x578f"] Feb 25 07:34:00 crc kubenswrapper[4749]: I0225 07:34:00.778757 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533414-x578f" event={"ID":"d3089009-af37-4a9f-b86c-64282651c575","Type":"ContainerStarted","Data":"003d49f5505a403d8bdb36bd4f228b1fb1df435e92067c7bf5cb0ec26b47f7e9"} Feb 25 07:34:02 crc kubenswrapper[4749]: I0225 07:34:02.792647 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3089009-af37-4a9f-b86c-64282651c575" containerID="0ee6f92d7b6d3121615e47838af3ebed4597f6f31a2aa6b41501cbcd9488b75e" exitCode=0 Feb 25 07:34:02 crc kubenswrapper[4749]: I0225 07:34:02.792712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533414-x578f" event={"ID":"d3089009-af37-4a9f-b86c-64282651c575","Type":"ContainerDied","Data":"0ee6f92d7b6d3121615e47838af3ebed4597f6f31a2aa6b41501cbcd9488b75e"} Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.057623 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.199226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-467tr\" (UniqueName: \"kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr\") pod \"d3089009-af37-4a9f-b86c-64282651c575\" (UID: \"d3089009-af37-4a9f-b86c-64282651c575\") " Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.206373 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr" (OuterVolumeSpecName: "kube-api-access-467tr") pod "d3089009-af37-4a9f-b86c-64282651c575" (UID: "d3089009-af37-4a9f-b86c-64282651c575"). InnerVolumeSpecName "kube-api-access-467tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.301544 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-467tr\" (UniqueName: \"kubernetes.io/projected/d3089009-af37-4a9f-b86c-64282651c575-kube-api-access-467tr\") on node \"crc\" DevicePath \"\"" Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.806171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533414-x578f" event={"ID":"d3089009-af37-4a9f-b86c-64282651c575","Type":"ContainerDied","Data":"003d49f5505a403d8bdb36bd4f228b1fb1df435e92067c7bf5cb0ec26b47f7e9"} Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.806208 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003d49f5505a403d8bdb36bd4f228b1fb1df435e92067c7bf5cb0ec26b47f7e9" Feb 25 07:34:04 crc kubenswrapper[4749]: I0225 07:34:04.806243 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533414-x578f" Feb 25 07:34:05 crc kubenswrapper[4749]: I0225 07:34:05.119953 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533408-bbf9b"] Feb 25 07:34:05 crc kubenswrapper[4749]: I0225 07:34:05.123442 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533408-bbf9b"] Feb 25 07:34:05 crc kubenswrapper[4749]: I0225 07:34:05.330155 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268d7f08-512d-45a3-b577-5f865420f45e" path="/var/lib/kubelet/pods/268d7f08-512d-45a3-b577-5f865420f45e/volumes" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.684615 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h"] Feb 25 07:34:09 crc kubenswrapper[4749]: E0225 07:34:09.685162 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3089009-af37-4a9f-b86c-64282651c575" containerName="oc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.685200 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3089009-af37-4a9f-b86c-64282651c575" containerName="oc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.685361 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3089009-af37-4a9f-b86c-64282651c575" containerName="oc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.685816 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.690157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zjmch" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.695052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.700622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.701512 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.703189 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tmrnh" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.706248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.761256 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.762285 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.770157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-sd7pd" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.770557 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.771325 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.775666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.776465 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6frm7" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.782069 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.791658 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.792454 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.795250 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.796043 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.798653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-hjdzs" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.805952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-frbfk" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.805994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.806354 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.823928 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.825251 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.830014 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.830028 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-drzcl" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.847441 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.871119 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.872255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.875008 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dw4w5" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.876432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.877231 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.877935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6mpn\" (UniqueName: \"kubernetes.io/projected/23e28440-3881-4d38-885e-3a20842b117d-kube-api-access-s6mpn\") pod \"glance-operator-controller-manager-784b5bb6c5-7h9cc\" (UID: \"23e28440-3881-4d38-885e-3a20842b117d\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.878001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccvbz\" (UniqueName: \"kubernetes.io/projected/12921b78-d19b-4a5c-be9e-5cf412b88186-kube-api-access-ccvbz\") pod \"barbican-operator-controller-manager-868647ff47-fn75h\" (UID: \"12921b78-d19b-4a5c-be9e-5cf412b88186\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.878023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfdd\" (UniqueName: \"kubernetes.io/projected/a8412981-280e-4153-b15e-7a5df751e110-kube-api-access-jkfdd\") pod \"designate-operator-controller-manager-6d8bf5c495-4jdng\" (UID: \"a8412981-280e-4153-b15e-7a5df751e110\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.878042 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dmz\" (UniqueName: \"kubernetes.io/projected/0d2964a7-7341-4f1f-ab51-b648ea057535-kube-api-access-74dmz\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jh8w\" (UID: \"0d2964a7-7341-4f1f-ab51-b648ea057535\") " 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.891081 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.906268 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5fqn7" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.932615 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.982730 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v"] Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.985934 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwj2\" (UniqueName: \"kubernetes.io/projected/dc40a3f9-e350-41dc-b13d-86ae3e46f551-kube-api-access-9kwj2\") pod \"heat-operator-controller-manager-69f49c598c-9tv6m\" (UID: \"dc40a3f9-e350-41dc-b13d-86ae3e46f551\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shfd\" (UniqueName: \"kubernetes.io/projected/0a7bf49d-b6d9-474b-b97c-cb555aa93f8a-kube-api-access-7shfd\") pod \"horizon-operator-controller-manager-5b9b8895d5-2tccs\" (UID: \"0a7bf49d-b6d9-474b-b97c-cb555aa93f8a\") " 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk2mt\" (UniqueName: \"kubernetes.io/projected/17bb447b-e6f8-4a04-98ea-8559cbd26d34-kube-api-access-mk2mt\") pod \"keystone-operator-controller-manager-b4d948c87-zjvss\" (UID: \"17bb447b-e6f8-4a04-98ea-8559cbd26d34\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg64\" (UniqueName: \"kubernetes.io/projected/b1ea3312-6b21-440a-b617-5681d286bcc4-kube-api-access-fgg64\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6mpn\" (UniqueName: \"kubernetes.io/projected/23e28440-3881-4d38-885e-3a20842b117d-kube-api-access-s6mpn\") pod \"glance-operator-controller-manager-784b5bb6c5-7h9cc\" (UID: \"23e28440-3881-4d38-885e-3a20842b117d\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987584 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccvbz\" (UniqueName: \"kubernetes.io/projected/12921b78-d19b-4a5c-be9e-5cf412b88186-kube-api-access-ccvbz\") pod \"barbican-operator-controller-manager-868647ff47-fn75h\" (UID: \"12921b78-d19b-4a5c-be9e-5cf412b88186\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfdd\" (UniqueName: \"kubernetes.io/projected/a8412981-280e-4153-b15e-7a5df751e110-kube-api-access-jkfdd\") pod \"designate-operator-controller-manager-6d8bf5c495-4jdng\" (UID: \"a8412981-280e-4153-b15e-7a5df751e110\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dmz\" (UniqueName: \"kubernetes.io/projected/0d2964a7-7341-4f1f-ab51-b648ea057535-kube-api-access-74dmz\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jh8w\" (UID: \"0d2964a7-7341-4f1f-ab51-b648ea057535\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.987705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssn4h\" (UniqueName: \"kubernetes.io/projected/a8e8e364-0b06-4b3d-9faf-4c7c3c233060-kube-api-access-ssn4h\") pod \"ironic-operator-controller-manager-554564d7fc-5nbpj\" (UID: \"a8e8e364-0b06-4b3d-9faf-4c7c3c233060\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:09 crc kubenswrapper[4749]: I0225 07:34:09.990781 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5dvc4" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.028387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dmz\" (UniqueName: \"kubernetes.io/projected/0d2964a7-7341-4f1f-ab51-b648ea057535-kube-api-access-74dmz\") pod \"cinder-operator-controller-manager-55d77d7b5c-9jh8w\" (UID: \"0d2964a7-7341-4f1f-ab51-b648ea057535\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.030574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.030755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccvbz\" (UniqueName: \"kubernetes.io/projected/12921b78-d19b-4a5c-be9e-5cf412b88186-kube-api-access-ccvbz\") pod \"barbican-operator-controller-manager-868647ff47-fn75h\" (UID: \"12921b78-d19b-4a5c-be9e-5cf412b88186\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.034823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfdd\" (UniqueName: \"kubernetes.io/projected/a8412981-280e-4153-b15e-7a5df751e110-kube-api-access-jkfdd\") pod \"designate-operator-controller-manager-6d8bf5c495-4jdng\" (UID: \"a8412981-280e-4153-b15e-7a5df751e110\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.035087 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.040026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6mpn\" (UniqueName: \"kubernetes.io/projected/23e28440-3881-4d38-885e-3a20842b117d-kube-api-access-s6mpn\") pod \"glance-operator-controller-manager-784b5bb6c5-7h9cc\" (UID: \"23e28440-3881-4d38-885e-3a20842b117d\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.048766 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.049791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.051910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-65kx4" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.087936 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.088888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwj2\" (UniqueName: \"kubernetes.io/projected/dc40a3f9-e350-41dc-b13d-86ae3e46f551-kube-api-access-9kwj2\") pod \"heat-operator-controller-manager-69f49c598c-9tv6m\" (UID: \"dc40a3f9-e350-41dc-b13d-86ae3e46f551\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.088927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shfd\" (UniqueName: \"kubernetes.io/projected/0a7bf49d-b6d9-474b-b97c-cb555aa93f8a-kube-api-access-7shfd\") pod \"horizon-operator-controller-manager-5b9b8895d5-2tccs\" (UID: \"0a7bf49d-b6d9-474b-b97c-cb555aa93f8a\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.088969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk2mt\" (UniqueName: \"kubernetes.io/projected/17bb447b-e6f8-4a04-98ea-8559cbd26d34-kube-api-access-mk2mt\") pod \"keystone-operator-controller-manager-b4d948c87-zjvss\" (UID: \"17bb447b-e6f8-4a04-98ea-8559cbd26d34\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.089013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg64\" (UniqueName: \"kubernetes.io/projected/b1ea3312-6b21-440a-b617-5681d286bcc4-kube-api-access-fgg64\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 
07:34:10.089059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkcr\" (UniqueName: \"kubernetes.io/projected/0594f0f5-d0ce-4c43-8572-3dc16130152e-kube-api-access-cfkcr\") pod \"manila-operator-controller-manager-67d996989d-b2j2v\" (UID: \"0594f0f5-d0ce-4c43-8572-3dc16130152e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.089084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.089119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssn4h\" (UniqueName: \"kubernetes.io/projected/a8e8e364-0b06-4b3d-9faf-4c7c3c233060-kube-api-access-ssn4h\") pod \"ironic-operator-controller-manager-554564d7fc-5nbpj\" (UID: \"a8e8e364-0b06-4b3d-9faf-4c7c3c233060\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.089843 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.089880 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:10.589865764 +0000 UTC m=+1003.951691784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.099221 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.141938 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.144698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssn4h\" (UniqueName: \"kubernetes.io/projected/a8e8e364-0b06-4b3d-9faf-4c7c3c233060-kube-api-access-ssn4h\") pod \"ironic-operator-controller-manager-554564d7fc-5nbpj\" (UID: \"a8e8e364-0b06-4b3d-9faf-4c7c3c233060\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.147139 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.148160 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.151315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwj2\" (UniqueName: \"kubernetes.io/projected/dc40a3f9-e350-41dc-b13d-86ae3e46f551-kube-api-access-9kwj2\") pod \"heat-operator-controller-manager-69f49c598c-9tv6m\" (UID: \"dc40a3f9-e350-41dc-b13d-86ae3e46f551\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.151434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.153978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lxw4h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.162511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk2mt\" (UniqueName: \"kubernetes.io/projected/17bb447b-e6f8-4a04-98ea-8559cbd26d34-kube-api-access-mk2mt\") pod \"keystone-operator-controller-manager-b4d948c87-zjvss\" (UID: \"17bb447b-e6f8-4a04-98ea-8559cbd26d34\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.169320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg64\" (UniqueName: \"kubernetes.io/projected/b1ea3312-6b21-440a-b617-5681d286bcc4-kube-api-access-fgg64\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.170746 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.171140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shfd\" (UniqueName: \"kubernetes.io/projected/0a7bf49d-b6d9-474b-b97c-cb555aa93f8a-kube-api-access-7shfd\") pod \"horizon-operator-controller-manager-5b9b8895d5-2tccs\" (UID: \"0a7bf49d-b6d9-474b-b97c-cb555aa93f8a\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.177458 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.177769 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.178876 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8cjt5" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.181097 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.181160 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.182372 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mfgsm" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.187084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.190465 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhf9\" (UniqueName: \"kubernetes.io/projected/b394d64a-2cf2-4cad-9b51-adbf56cb696c-kube-api-access-hnhf9\") pod \"mariadb-operator-controller-manager-6994f66f48-g4h2l\" (UID: \"b394d64a-2cf2-4cad-9b51-adbf56cb696c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.190518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkcr\" (UniqueName: \"kubernetes.io/projected/0594f0f5-d0ce-4c43-8572-3dc16130152e-kube-api-access-cfkcr\") pod \"manila-operator-controller-manager-67d996989d-b2j2v\" (UID: \"0594f0f5-d0ce-4c43-8572-3dc16130152e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.195994 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.210333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.210570 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.213750 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wk5jc" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.214277 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.224538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkcr\" (UniqueName: \"kubernetes.io/projected/0594f0f5-d0ce-4c43-8572-3dc16130152e-kube-api-access-cfkcr\") pod \"manila-operator-controller-manager-67d996989d-b2j2v\" (UID: \"0594f0f5-d0ce-4c43-8572-3dc16130152e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.225254 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.227070 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.227576 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.228146 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.237741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.238002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bs59m" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.238112 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k9967" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.238359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.243786 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.257575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.265877 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.266723 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.273317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4jsfk" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.279917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.290523 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wv258"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.291434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.292999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7b6w\" (UniqueName: \"kubernetes.io/projected/b54d6e5b-b74d-4e25-808a-20383b1b02e0-kube-api-access-p7b6w\") pod \"neutron-operator-controller-manager-6bd4687957-sptf9\" (UID: \"b54d6e5b-b74d-4e25-808a-20383b1b02e0\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.293051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpssn\" (UniqueName: \"kubernetes.io/projected/c881d86e-332d-419d-8c8a-9b7dfafe8c3c-kube-api-access-vpssn\") pod \"nova-operator-controller-manager-567668f5cf-q64h7\" (UID: \"c881d86e-332d-419d-8c8a-9b7dfafe8c3c\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.293075 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4827\" (UniqueName: \"kubernetes.io/projected/9fcf9db9-6abb-445d-aa8c-8d5e60431838-kube-api-access-j4827\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lrvt\" (UID: \"9fcf9db9-6abb-445d-aa8c-8d5e60431838\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.293101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhf9\" (UniqueName: \"kubernetes.io/projected/b394d64a-2cf2-4cad-9b51-adbf56cb696c-kube-api-access-hnhf9\") pod \"mariadb-operator-controller-manager-6994f66f48-g4h2l\" (UID: \"b394d64a-2cf2-4cad-9b51-adbf56cb696c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.296563 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rgmwm" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.299734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wv258"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.309623 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.312061 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.316527 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhf9\" (UniqueName: \"kubernetes.io/projected/b394d64a-2cf2-4cad-9b51-adbf56cb696c-kube-api-access-hnhf9\") pod \"mariadb-operator-controller-manager-6994f66f48-g4h2l\" (UID: \"b394d64a-2cf2-4cad-9b51-adbf56cb696c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.318486 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.322483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vxph9" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.359952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdhld\" (UniqueName: \"kubernetes.io/projected/6dc6924f-3556-4ac1-b787-57fa3e20297f-kube-api-access-hdhld\") pod \"telemetry-operator-controller-manager-589c568786-wv258\" (UID: \"6dc6924f-3556-4ac1-b787-57fa3e20297f\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr9rf\" (UniqueName: \"kubernetes.io/projected/e1c9da55-b9bc-4bb7-b73a-ca73c928f333-kube-api-access-wr9rf\") pod \"ovn-operator-controller-manager-5955d8c787-n4v8h\" (UID: \"e1c9da55-b9bc-4bb7-b73a-ca73c928f333\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7b6w\" (UniqueName: \"kubernetes.io/projected/b54d6e5b-b74d-4e25-808a-20383b1b02e0-kube-api-access-p7b6w\") pod \"neutron-operator-controller-manager-6bd4687957-sptf9\" (UID: \"b54d6e5b-b74d-4e25-808a-20383b1b02e0\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zth9\" (UniqueName: \"kubernetes.io/projected/e1152a21-82e3-4a7f-92c8-8633abeecb26-kube-api-access-7zth9\") pod \"swift-operator-controller-manager-68f46476f-nqhfv\" (UID: \"e1152a21-82e3-4a7f-92c8-8633abeecb26\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpssn\" (UniqueName: \"kubernetes.io/projected/c881d86e-332d-419d-8c8a-9b7dfafe8c3c-kube-api-access-vpssn\") pod 
\"nova-operator-controller-manager-567668f5cf-q64h7\" (UID: \"c881d86e-332d-419d-8c8a-9b7dfafe8c3c\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsttw\" (UniqueName: \"kubernetes.io/projected/6f55e1ee-705c-409e-b34a-77232bf089eb-kube-api-access-dsttw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4827\" (UniqueName: \"kubernetes.io/projected/9fcf9db9-6abb-445d-aa8c-8d5e60431838-kube-api-access-j4827\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lrvt\" (UID: \"9fcf9db9-6abb-445d-aa8c-8d5e60431838\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.396872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbj8\" (UniqueName: \"kubernetes.io/projected/9bae09cb-1d81-4971-82bc-84d87a6dca77-kube-api-access-nxbj8\") pod \"placement-operator-controller-manager-8497b45c89-5gt8h\" (UID: \"9bae09cb-1d81-4971-82bc-84d87a6dca77\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.424738 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.442855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpssn\" (UniqueName: \"kubernetes.io/projected/c881d86e-332d-419d-8c8a-9b7dfafe8c3c-kube-api-access-vpssn\") pod \"nova-operator-controller-manager-567668f5cf-q64h7\" (UID: \"c881d86e-332d-419d-8c8a-9b7dfafe8c3c\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.443839 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.444610 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.445428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4827\" (UniqueName: \"kubernetes.io/projected/9fcf9db9-6abb-445d-aa8c-8d5e60431838-kube-api-access-j4827\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lrvt\" (UID: \"9fcf9db9-6abb-445d-aa8c-8d5e60431838\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.466204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7b6w\" (UniqueName: \"kubernetes.io/projected/b54d6e5b-b74d-4e25-808a-20383b1b02e0-kube-api-access-p7b6w\") pod \"neutron-operator-controller-manager-6bd4687957-sptf9\" (UID: \"b54d6e5b-b74d-4e25-808a-20383b1b02e0\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.473872 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.485774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.487615 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.488103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-zqvdv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.501466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsttw\" (UniqueName: \"kubernetes.io/projected/6f55e1ee-705c-409e-b34a-77232bf089eb-kube-api-access-dsttw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.501545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbj8\" (UniqueName: \"kubernetes.io/projected/9bae09cb-1d81-4971-82bc-84d87a6dca77-kube-api-access-nxbj8\") pod \"placement-operator-controller-manager-8497b45c89-5gt8h\" (UID: \"9bae09cb-1d81-4971-82bc-84d87a6dca77\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.501577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdhld\" (UniqueName: \"kubernetes.io/projected/6dc6924f-3556-4ac1-b787-57fa3e20297f-kube-api-access-hdhld\") pod 
\"telemetry-operator-controller-manager-589c568786-wv258\" (UID: \"6dc6924f-3556-4ac1-b787-57fa3e20297f\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.501630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbjq\" (UniqueName: \"kubernetes.io/projected/0ada39b5-592e-448e-9a12-2f9e95906a74-kube-api-access-6nbjq\") pod \"test-operator-controller-manager-5dc6794d5b-wh7zn\" (UID: \"0ada39b5-592e-448e-9a12-2f9e95906a74\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.502356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.502448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr9rf\" (UniqueName: \"kubernetes.io/projected/e1c9da55-b9bc-4bb7-b73a-ca73c928f333-kube-api-access-wr9rf\") pod \"ovn-operator-controller-manager-5955d8c787-n4v8h\" (UID: \"e1c9da55-b9bc-4bb7-b73a-ca73c928f333\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.502549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zth9\" (UniqueName: \"kubernetes.io/projected/e1152a21-82e3-4a7f-92c8-8633abeecb26-kube-api-access-7zth9\") pod \"swift-operator-controller-manager-68f46476f-nqhfv\" (UID: \"e1152a21-82e3-4a7f-92c8-8633abeecb26\") " 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.502999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.510616 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.510706 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:11.010679176 +0000 UTC m=+1004.372505276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.516156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.529086 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.533439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbj8\" (UniqueName: \"kubernetes.io/projected/9bae09cb-1d81-4971-82bc-84d87a6dca77-kube-api-access-nxbj8\") pod \"placement-operator-controller-manager-8497b45c89-5gt8h\" (UID: \"9bae09cb-1d81-4971-82bc-84d87a6dca77\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.538357 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.539029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr9rf\" (UniqueName: \"kubernetes.io/projected/e1c9da55-b9bc-4bb7-b73a-ca73c928f333-kube-api-access-wr9rf\") pod \"ovn-operator-controller-manager-5955d8c787-n4v8h\" (UID: \"e1c9da55-b9bc-4bb7-b73a-ca73c928f333\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.539250 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zth9\" (UniqueName: \"kubernetes.io/projected/e1152a21-82e3-4a7f-92c8-8633abeecb26-kube-api-access-7zth9\") pod \"swift-operator-controller-manager-68f46476f-nqhfv\" (UID: \"e1152a21-82e3-4a7f-92c8-8633abeecb26\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.539460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dsttw\" (UniqueName: \"kubernetes.io/projected/6f55e1ee-705c-409e-b34a-77232bf089eb-kube-api-access-dsttw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.545331 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdhld\" (UniqueName: \"kubernetes.io/projected/6dc6924f-3556-4ac1-b787-57fa3e20297f-kube-api-access-hdhld\") pod \"telemetry-operator-controller-manager-589c568786-wv258\" (UID: \"6dc6924f-3556-4ac1-b787-57fa3e20297f\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.575762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.576658 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.577684 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.579104 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.579115 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.580094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-756xx" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.604853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.604921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/6b0526d5-8f6f-47d4-87f5-0deb2c091848-kube-api-access-7j7nj\") pod \"watcher-operator-controller-manager-bccc79885-7fzsq\" (UID: \"6b0526d5-8f6f-47d4-87f5-0deb2c091848\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.604997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbjq\" (UniqueName: \"kubernetes.io/projected/0ada39b5-592e-448e-9a12-2f9e95906a74-kube-api-access-6nbjq\") pod \"test-operator-controller-manager-5dc6794d5b-wh7zn\" (UID: \"0ada39b5-592e-448e-9a12-2f9e95906a74\") " 
pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.605307 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.605352 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:11.605339329 +0000 UTC m=+1004.967165349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.606684 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.610083 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.620056 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.621505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.627041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-nr7rd" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.628247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbjq\" (UniqueName: \"kubernetes.io/projected/0ada39b5-592e-448e-9a12-2f9e95906a74-kube-api-access-6nbjq\") pod \"test-operator-controller-manager-5dc6794d5b-wh7zn\" (UID: \"0ada39b5-592e-448e-9a12-2f9e95906a74\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.632468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.644323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.654851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.659588 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.674916 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.706153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.706192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.706212 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dvj\" (UniqueName: \"kubernetes.io/projected/37a9e86e-0ee0-4447-910a-a185f4681508-kube-api-access-w2dvj\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.706498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/6b0526d5-8f6f-47d4-87f5-0deb2c091848-kube-api-access-7j7nj\") pod \"watcher-operator-controller-manager-bccc79885-7fzsq\" (UID: \"6b0526d5-8f6f-47d4-87f5-0deb2c091848\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:10 crc 
kubenswrapper[4749]: I0225 07:34:10.725906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7nj\" (UniqueName: \"kubernetes.io/projected/6b0526d5-8f6f-47d4-87f5-0deb2c091848-kube-api-access-7j7nj\") pod \"watcher-operator-controller-manager-bccc79885-7fzsq\" (UID: \"6b0526d5-8f6f-47d4-87f5-0deb2c091848\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.809537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52q5\" (UniqueName: \"kubernetes.io/projected/d3e727f4-f059-41f3-94d1-fef7a644f2b2-kube-api-access-x52q5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vx8bh\" (UID: \"d3e727f4-f059-41f3-94d1-fef7a644f2b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.809615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.809645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.809661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dvj\" (UniqueName: 
\"kubernetes.io/projected/37a9e86e-0ee0-4447-910a-a185f4681508-kube-api-access-w2dvj\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.809923 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.810016 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:11.309992436 +0000 UTC m=+1004.671818456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.809923 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: E0225 07:34:10.810137 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:11.310109159 +0000 UTC m=+1004.671935179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.826964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dvj\" (UniqueName: \"kubernetes.io/projected/37a9e86e-0ee0-4447-910a-a185f4681508-kube-api-access-w2dvj\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.840936 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.879473 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.895536 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.899298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" event={"ID":"a8412981-280e-4153-b15e-7a5df751e110","Type":"ContainerStarted","Data":"ffc7cbee57dbee9e7866ddaa45005e031c728b44ef869c4ff05dfb457dca5f58"} Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.911158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x52q5\" (UniqueName: \"kubernetes.io/projected/d3e727f4-f059-41f3-94d1-fef7a644f2b2-kube-api-access-x52q5\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-vx8bh\" (UID: \"d3e727f4-f059-41f3-94d1-fef7a644f2b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.913457 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w"] Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.946213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52q5\" (UniqueName: \"kubernetes.io/projected/d3e727f4-f059-41f3-94d1-fef7a644f2b2-kube-api-access-x52q5\") pod \"rabbitmq-cluster-operator-manager-668c99d594-vx8bh\" (UID: \"d3e727f4-f059-41f3-94d1-fef7a644f2b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" Feb 25 07:34:10 crc kubenswrapper[4749]: I0225 07:34:10.946763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" Feb 25 07:34:10 crc kubenswrapper[4749]: W0225 07:34:10.953559 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d2964a7_7341_4f1f_ab51_b648ea057535.slice/crio-93ad288f8ca84b36e3c053a05b294f149f17530d80a1f0a9e59a6125ef663377 WatchSource:0}: Error finding container 93ad288f8ca84b36e3c053a05b294f149f17530d80a1f0a9e59a6125ef663377: Status 404 returned error can't find the container with id 93ad288f8ca84b36e3c053a05b294f149f17530d80a1f0a9e59a6125ef663377 Feb 25 07:34:10 crc kubenswrapper[4749]: W0225 07:34:10.991223 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e28440_3881_4d38_885e_3a20842b117d.slice/crio-4287e2b6dd2db7164304951f767c910ee03f3327f817e39ef19f30f4e5e22aa9 WatchSource:0}: Error finding container 4287e2b6dd2db7164304951f767c910ee03f3327f817e39ef19f30f4e5e22aa9: 
Status 404 returned error can't find the container with id 4287e2b6dd2db7164304951f767c910ee03f3327f817e39ef19f30f4e5e22aa9 Feb 25 07:34:10 crc kubenswrapper[4749]: W0225 07:34:10.997926 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e8e364_0b06_4b3d_9faf_4c7c3c233060.slice/crio-132858ec347d5d5faaa33b1089c3c7baf023b1a815ddff45991fcbe8e094611f WatchSource:0}: Error finding container 132858ec347d5d5faaa33b1089c3c7baf023b1a815ddff45991fcbe8e094611f: Status 404 returned error can't find the container with id 132858ec347d5d5faaa33b1089c3c7baf023b1a815ddff45991fcbe8e094611f Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.011909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.012090 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.012139 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:12.012124412 +0000 UTC m=+1005.373950432 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.035663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.047075 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.178730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.302702 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.313458 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.318105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.318152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") 
pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.318305 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.318360 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:12.318338099 +0000 UTC m=+1005.680164119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.324838 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.324921 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:12.324902776 +0000 UTC m=+1005.686728786 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.350230 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.350806 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt"] Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.387215 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.410451 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb394d64a_2cf2_4cad_9b51_adbf56cb696c.slice/crio-3ecd2a4ece8965f1c5823399fa9249959d1859906623d6bdbcb672ae6905fafb WatchSource:0}: Error finding container 3ecd2a4ece8965f1c5823399fa9249959d1859906623d6bdbcb672ae6905fafb: Status 404 returned error can't find the container with id 3ecd2a4ece8965f1c5823399fa9249959d1859906623d6bdbcb672ae6905fafb Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.542334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.567069 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bae09cb_1d81_4971_82bc_84d87a6dca77.slice/crio-9cd58c08dfc5bc906bcf0e1b23ea46f17504c55cf7afcabf739598ebfa6cd217 WatchSource:0}: Error finding container 
9cd58c08dfc5bc906bcf0e1b23ea46f17504c55cf7afcabf739598ebfa6cd217: Status 404 returned error can't find the container with id 9cd58c08dfc5bc906bcf0e1b23ea46f17504c55cf7afcabf739598ebfa6cd217 Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.575868 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.582639 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c9da55_b9bc_4bb7_b73a_ca73c928f333.slice/crio-de1a20509bf52f28132fba0137b962783a765522d247de61d3a9aef8125196b9 WatchSource:0}: Error finding container de1a20509bf52f28132fba0137b962783a765522d247de61d3a9aef8125196b9: Status 404 returned error can't find the container with id de1a20509bf52f28132fba0137b962783a765522d247de61d3a9aef8125196b9 Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.588310 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.595871 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb54d6e5b_b74d_4e25_808a_20383b1b02e0.slice/crio-969b41b7a8195d643bdd255012eeadb22c269028f280f085a299d98b8b8ca514 WatchSource:0}: Error finding container 969b41b7a8195d643bdd255012eeadb22c269028f280f085a299d98b8b8ca514: Status 404 returned error can't find the container with id 969b41b7a8195d643bdd255012eeadb22c269028f280f085a299d98b8b8ca514 Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.596668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9"] Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.599248 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zth9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-nqhfv_openstack-operators(e1152a21-82e3-4a7f-92c8-8633abeecb26): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.600478 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" podUID="e1152a21-82e3-4a7f-92c8-8633abeecb26" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.602689 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7b6w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-sptf9_openstack-operators(b54d6e5b-b74d-4e25-808a-20383b1b02e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.604422 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" podUID="b54d6e5b-b74d-4e25-808a-20383b1b02e0" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.622057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.622229 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 
07:34:11.622293 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:13.622277551 +0000 UTC m=+1006.984103571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.662429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-wv258"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.674973 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc6924f_3556_4ac1_b787_57fa3e20297f.slice/crio-aedc335bdf200df77326b79927f94e86e9a5ff90265e10926c96602ae7f3a47e WatchSource:0}: Error finding container aedc335bdf200df77326b79927f94e86e9a5ff90265e10926c96602ae7f3a47e: Status 404 returned error can't find the container with id aedc335bdf200df77326b79927f94e86e9a5ff90265e10926c96602ae7f3a47e Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.680444 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn"] Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.684620 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hdhld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-589c568786-wv258_openstack-operators(6dc6924f-3556-4ac1-b787-57fa3e20297f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.686024 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" podUID="6dc6924f-3556-4ac1-b787-57fa3e20297f" Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.696280 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ada39b5_592e_448e_9a12_2f9e95906a74.slice/crio-723c2a18884ef0358459ad328d90154aa03662a9d3d5283171a9d73c9e0fb5d7 WatchSource:0}: Error finding container 723c2a18884ef0358459ad328d90154aa03662a9d3d5283171a9d73c9e0fb5d7: Status 404 returned error can't find the container with id 723c2a18884ef0358459ad328d90154aa03662a9d3d5283171a9d73c9e0fb5d7 Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.705361 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nbjq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5dc6794d5b-wh7zn_openstack-operators(0ada39b5-592e-448e-9a12-2f9e95906a74): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.708335 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" podUID="0ada39b5-592e-448e-9a12-2f9e95906a74" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.812658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh"] Feb 25 07:34:11 crc kubenswrapper[4749]: W0225 07:34:11.832283 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e727f4_f059_41f3_94d1_fef7a644f2b2.slice/crio-61e34846e8ac1e0db1d56014e980b1be6103f691e23745a6f1ca5e0db8e6322e WatchSource:0}: Error finding container 61e34846e8ac1e0db1d56014e980b1be6103f691e23745a6f1ca5e0db8e6322e: Status 404 returned error can't find the container with id 
61e34846e8ac1e0db1d56014e980b1be6103f691e23745a6f1ca5e0db8e6322e Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.834422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq"] Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.846319 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7j7nj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-7fzsq_openstack-operators(6b0526d5-8f6f-47d4-87f5-0deb2c091848): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.848213 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" podUID="6b0526d5-8f6f-47d4-87f5-0deb2c091848" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.910381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" event={"ID":"0594f0f5-d0ce-4c43-8572-3dc16130152e","Type":"ContainerStarted","Data":"848735111d1640be73d8804a37708b09ee3bbaac35c5d04ed14f7a63aeae6494"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.911444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" 
event={"ID":"6b0526d5-8f6f-47d4-87f5-0deb2c091848","Type":"ContainerStarted","Data":"ccbe9aed39cfa94b96462d65696bfa005956ce582fc73bdd11229e2cfc2cb3b7"} Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.912845 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" podUID="6b0526d5-8f6f-47d4-87f5-0deb2c091848" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.914487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" event={"ID":"12921b78-d19b-4a5c-be9e-5cf412b88186","Type":"ContainerStarted","Data":"c66c5278e0f5cc1017aeaf8d107e5325dfb9b3fbbf0e928726fb4e3908267db7"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.922047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" event={"ID":"6dc6924f-3556-4ac1-b787-57fa3e20297f","Type":"ContainerStarted","Data":"aedc335bdf200df77326b79927f94e86e9a5ff90265e10926c96602ae7f3a47e"} Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.933384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" podUID="6dc6924f-3556-4ac1-b787-57fa3e20297f" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.935299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" event={"ID":"b394d64a-2cf2-4cad-9b51-adbf56cb696c","Type":"ContainerStarted","Data":"3ecd2a4ece8965f1c5823399fa9249959d1859906623d6bdbcb672ae6905fafb"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.944771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" event={"ID":"a8e8e364-0b06-4b3d-9faf-4c7c3c233060","Type":"ContainerStarted","Data":"132858ec347d5d5faaa33b1089c3c7baf023b1a815ddff45991fcbe8e094611f"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.950662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" event={"ID":"17bb447b-e6f8-4a04-98ea-8559cbd26d34","Type":"ContainerStarted","Data":"6f605802cf2a086848c811f7e9cbf8bb6b1a5e9b796dfa6f40fc0cc94c8ed2af"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.952761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" event={"ID":"b54d6e5b-b74d-4e25-808a-20383b1b02e0","Type":"ContainerStarted","Data":"969b41b7a8195d643bdd255012eeadb22c269028f280f085a299d98b8b8ca514"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.954863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" event={"ID":"0a7bf49d-b6d9-474b-b97c-cb555aa93f8a","Type":"ContainerStarted","Data":"5bd2fd78c6f6ab9d7fa596acc98238c4bf0a7758443deedda7da88d1315db2ca"} Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.955175 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" 
pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" podUID="b54d6e5b-b74d-4e25-808a-20383b1b02e0" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.956117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" event={"ID":"0ada39b5-592e-448e-9a12-2f9e95906a74","Type":"ContainerStarted","Data":"723c2a18884ef0358459ad328d90154aa03662a9d3d5283171a9d73c9e0fb5d7"} Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.957105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" podUID="0ada39b5-592e-448e-9a12-2f9e95906a74" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.957936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" event={"ID":"23e28440-3881-4d38-885e-3a20842b117d","Type":"ContainerStarted","Data":"4287e2b6dd2db7164304951f767c910ee03f3327f817e39ef19f30f4e5e22aa9"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.959463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" event={"ID":"0d2964a7-7341-4f1f-ab51-b648ea057535","Type":"ContainerStarted","Data":"93ad288f8ca84b36e3c053a05b294f149f17530d80a1f0a9e59a6125ef663377"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.961432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" event={"ID":"e1c9da55-b9bc-4bb7-b73a-ca73c928f333","Type":"ContainerStarted","Data":"de1a20509bf52f28132fba0137b962783a765522d247de61d3a9aef8125196b9"} Feb 25 07:34:11 crc 
kubenswrapper[4749]: I0225 07:34:11.966116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" event={"ID":"9fcf9db9-6abb-445d-aa8c-8d5e60431838","Type":"ContainerStarted","Data":"df3dbceb6dcdb945548243d52003503d094b50de19995126b45b55b46e9a84e5"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.967221 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" event={"ID":"d3e727f4-f059-41f3-94d1-fef7a644f2b2","Type":"ContainerStarted","Data":"61e34846e8ac1e0db1d56014e980b1be6103f691e23745a6f1ca5e0db8e6322e"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.970293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" event={"ID":"9bae09cb-1d81-4971-82bc-84d87a6dca77","Type":"ContainerStarted","Data":"9cd58c08dfc5bc906bcf0e1b23ea46f17504c55cf7afcabf739598ebfa6cd217"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.974095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" event={"ID":"e1152a21-82e3-4a7f-92c8-8633abeecb26","Type":"ContainerStarted","Data":"49c7c92e7be92d0cedf652b80dc3836b487a80ded1aa3663529d468ce16970a6"} Feb 25 07:34:11 crc kubenswrapper[4749]: E0225 07:34:11.980536 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" podUID="e1152a21-82e3-4a7f-92c8-8633abeecb26" Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.980558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" event={"ID":"c881d86e-332d-419d-8c8a-9b7dfafe8c3c","Type":"ContainerStarted","Data":"ebdb4ee3d50d4ccd9794040b6052f84441c4a7323c00f11b37053f43a0284c5d"} Feb 25 07:34:11 crc kubenswrapper[4749]: I0225 07:34:11.982623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" event={"ID":"dc40a3f9-e350-41dc-b13d-86ae3e46f551","Type":"ContainerStarted","Data":"b6c0b161c3113e699b9aeab2e4f173cf2e6a4f1c4c0b92bf895a1841256dbe43"} Feb 25 07:34:12 crc kubenswrapper[4749]: I0225 07:34:12.036675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.037200 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.037266 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:14.03724233 +0000 UTC m=+1007.399068380 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:12 crc kubenswrapper[4749]: I0225 07:34:12.340754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:12 crc kubenswrapper[4749]: I0225 07:34:12.341014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.340900 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.341135 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.341173 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:14.341136281 +0000 UTC m=+1007.702962301 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:12 crc kubenswrapper[4749]: E0225 07:34:12.341192 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:14.341183992 +0000 UTC m=+1007.703010002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.009292 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" podUID="e1152a21-82e3-4a7f-92c8-8633abeecb26" Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.009292 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4eb8fab5530a08915d3ab3e11e2808aeae16c8a220ed34ee04a186b2ae2303dc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" podUID="6dc6924f-3556-4ac1-b787-57fa3e20297f" Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.013975 4749 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:38e6a5bd24ab1684f22a64186fe99a7cdc7897eb7feb715ec1704eea7596dd98\\\"\"" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" podUID="0ada39b5-592e-448e-9a12-2f9e95906a74" Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.014060 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" podUID="b54d6e5b-b74d-4e25-808a-20383b1b02e0" Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.014316 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" podUID="6b0526d5-8f6f-47d4-87f5-0deb2c091848" Feb 25 07:34:13 crc kubenswrapper[4749]: I0225 07:34:13.666958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:13 crc kubenswrapper[4749]: E0225 07:34:13.667098 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:13 crc kubenswrapper[4749]: 
E0225 07:34:13.667351 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:17.667331742 +0000 UTC m=+1011.029157762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: I0225 07:34:14.072926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.073095 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.073242 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:18.073225764 +0000 UTC m=+1011.435051784 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: I0225 07:34:14.377396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:14 crc kubenswrapper[4749]: I0225 07:34:14.377536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.377565 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.377663 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.377668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:18.377646768 +0000 UTC m=+1011.739472808 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:14 crc kubenswrapper[4749]: E0225 07:34:14.377725 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:18.377708059 +0000 UTC m=+1011.739534079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:17 crc kubenswrapper[4749]: I0225 07:34:17.730643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:17 crc kubenswrapper[4749]: E0225 07:34:17.730840 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:17 crc kubenswrapper[4749]: E0225 07:34:17.730926 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:25.730902948 +0000 UTC m=+1019.092729048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: I0225 07:34:18.137090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.137309 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.137674 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:26.137645849 +0000 UTC m=+1019.499471879 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: I0225 07:34:18.440624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:18 crc kubenswrapper[4749]: I0225 07:34:18.440709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.442462 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.442512 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:26.442498263 +0000 UTC m=+1019.804324283 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.443112 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:18 crc kubenswrapper[4749]: E0225 07:34:18.443144 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:26.443135968 +0000 UTC m=+1019.804961988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:21 crc kubenswrapper[4749]: I0225 07:34:21.672334 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:34:21 crc kubenswrapper[4749]: I0225 07:34:21.672716 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:34:23 crc kubenswrapper[4749]: E0225 07:34:23.494054 4749 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 25 07:34:23 crc kubenswrapper[4749]: E0225 07:34:23.494239 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7shfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-2tccs_openstack-operators(0a7bf49d-b6d9-474b-b97c-cb555aa93f8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:23 crc kubenswrapper[4749]: E0225 07:34:23.495421 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" podUID="0a7bf49d-b6d9-474b-b97c-cb555aa93f8a" Feb 25 07:34:24 crc kubenswrapper[4749]: E0225 07:34:24.090226 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" podUID="0a7bf49d-b6d9-474b-b97c-cb555aa93f8a" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.080061 4749 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.080257 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x52q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-vx8bh_openstack-operators(d3e727f4-f059-41f3-94d1-fef7a644f2b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.081523 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" podUID="d3e727f4-f059-41f3-94d1-fef7a644f2b2" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.102241 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" podUID="d3e727f4-f059-41f3-94d1-fef7a644f2b2" Feb 25 07:34:25 crc 
kubenswrapper[4749]: E0225 07:34:25.624263 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.624467 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxbj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-5gt8h_openstack-operators(9bae09cb-1d81-4971-82bc-84d87a6dca77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.625673 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" podUID="9bae09cb-1d81-4971-82bc-84d87a6dca77" Feb 25 07:34:25 crc kubenswrapper[4749]: I0225 07:34:25.764340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.764489 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Feb 25 07:34:25 crc kubenswrapper[4749]: E0225 07:34:25.764535 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert podName:b1ea3312-6b21-440a-b617-5681d286bcc4 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:41.76452215 +0000 UTC m=+1035.126348170 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert") pod "infra-operator-controller-manager-79d975b745-rtn6b" (UID: "b1ea3312-6b21-440a-b617-5681d286bcc4") : secret "infra-operator-webhook-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.103715 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" podUID="9bae09cb-1d81-4971-82bc-84d87a6dca77" Feb 25 07:34:26 crc kubenswrapper[4749]: I0225 07:34:26.170064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.170221 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.170270 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert podName:6f55e1ee-705c-409e-b34a-77232bf089eb nodeName:}" failed. No retries permitted until 2026-02-25 07:34:42.170254228 +0000 UTC m=+1035.532080248 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" (UID: "6f55e1ee-705c-409e-b34a-77232bf089eb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: I0225 07:34:26.473181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:26 crc kubenswrapper[4749]: I0225 07:34:26.473235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.473372 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.473479 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. 
No retries permitted until 2026-02-25 07:34:42.473452752 +0000 UTC m=+1035.835278812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "webhook-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.473523 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 25 07:34:26 crc kubenswrapper[4749]: E0225 07:34:26.473694 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs podName:37a9e86e-0ee0-4447-910a-a185f4681508 nodeName:}" failed. No retries permitted until 2026-02-25 07:34:42.473663507 +0000 UTC m=+1035.835489557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs") pod "openstack-operator-controller-manager-6699bbbd4-8bgxl" (UID: "37a9e86e-0ee0-4447-910a-a185f4681508") : secret "metrics-server-cert" not found Feb 25 07:34:32 crc kubenswrapper[4749]: E0225 07:34:32.255903 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3" Feb 25 07:34:32 crc kubenswrapper[4749]: E0225 07:34:32.256658 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-74dmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-9jh8w_openstack-operators(0d2964a7-7341-4f1f-ab51-b648ea057535): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:32 crc kubenswrapper[4749]: E0225 07:34:32.258234 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" podUID="0d2964a7-7341-4f1f-ab51-b648ea057535" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.158358 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" podUID="0d2964a7-7341-4f1f-ab51-b648ea057535" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.258015 4749 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.258212 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mk2mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-zjvss_openstack-operators(17bb447b-e6f8-4a04-98ea-8559cbd26d34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.259385 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" podUID="17bb447b-e6f8-4a04-98ea-8559cbd26d34" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.943614 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.943781 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vpssn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-q64h7_openstack-operators(c881d86e-332d-419d-8c8a-9b7dfafe8c3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:34:33 crc kubenswrapper[4749]: E0225 07:34:33.944973 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" podUID="c881d86e-332d-419d-8c8a-9b7dfafe8c3c" Feb 25 07:34:34 crc kubenswrapper[4749]: E0225 07:34:34.165384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" podUID="17bb447b-e6f8-4a04-98ea-8559cbd26d34" Feb 25 07:34:34 crc kubenswrapper[4749]: E0225 07:34:34.165405 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" podUID="c881d86e-332d-419d-8c8a-9b7dfafe8c3c" Feb 25 07:34:34 crc kubenswrapper[4749]: I0225 07:34:34.712945 4749 scope.go:117] "RemoveContainer" containerID="1656393c0ad300dce187b340e77b5ac6c9f979b63d1cbb70310968d5e5a2cad4" Feb 25 07:34:36 crc kubenswrapper[4749]: I0225 07:34:36.191564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" event={"ID":"b394d64a-2cf2-4cad-9b51-adbf56cb696c","Type":"ContainerStarted","Data":"50aaf3040e392ec7edf564b9608f0c61ea8dff8fb0f9d97c4de6702451be1f27"} Feb 25 07:34:36 crc kubenswrapper[4749]: I0225 07:34:36.191886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:36 crc kubenswrapper[4749]: I0225 07:34:36.206954 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" podStartSLOduration=4.703508526 podStartE2EDuration="27.206938591s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.413878913 +0000 UTC m=+1004.775704933" lastFinishedPulling="2026-02-25 07:34:33.917308978 +0000 UTC m=+1027.279134998" observedRunningTime="2026-02-25 07:34:36.205718732 +0000 UTC m=+1029.567544752" watchObservedRunningTime="2026-02-25 07:34:36.206938591 +0000 UTC m=+1029.568764611" Feb 25 07:34:37 crc kubenswrapper[4749]: I0225 07:34:37.213754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" 
event={"ID":"a8412981-280e-4153-b15e-7a5df751e110","Type":"ContainerStarted","Data":"7ff24c60c323841450495495a5512d1bd5f2472218ea95abf625315916b46753"} Feb 25 07:34:37 crc kubenswrapper[4749]: I0225 07:34:37.227664 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" podStartSLOduration=4.994298871 podStartE2EDuration="28.227645484s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:10.684097999 +0000 UTC m=+1004.045924019" lastFinishedPulling="2026-02-25 07:34:33.917444612 +0000 UTC m=+1027.279270632" observedRunningTime="2026-02-25 07:34:37.225585844 +0000 UTC m=+1030.587411884" watchObservedRunningTime="2026-02-25 07:34:37.227645484 +0000 UTC m=+1030.589471504" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.221499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" event={"ID":"12921b78-d19b-4a5c-be9e-5cf412b88186","Type":"ContainerStarted","Data":"76003cedc59ee86f404a35b06f50f9f0420f0ac6716d3559c66806ee8152e4fa"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.221571 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.222804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" event={"ID":"6dc6924f-3556-4ac1-b787-57fa3e20297f","Type":"ContainerStarted","Data":"12bfcf90eece754dfdea903d0756ff7d35abd728f15af36cce14fe00011df6cf"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.223163 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 
07:34:38.224634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" event={"ID":"6b0526d5-8f6f-47d4-87f5-0deb2c091848","Type":"ContainerStarted","Data":"44bfacdad921bbb4c7f89a5c2c342bc3cba067e61bf3d01e565082287986a18f"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.225160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.227073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" event={"ID":"b54d6e5b-b74d-4e25-808a-20383b1b02e0","Type":"ContainerStarted","Data":"d87184e115af36d5eec8cf140ce8cb9b51e1e046e607e25b796ac873b1c732b2"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.227287 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.228573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" event={"ID":"a8e8e364-0b06-4b3d-9faf-4c7c3c233060","Type":"ContainerStarted","Data":"96e0e95593a048484b8345dc9c0c7d4a635691c3f91fc105c770f51cb25182f3"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.232163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" event={"ID":"dc40a3f9-e350-41dc-b13d-86ae3e46f551","Type":"ContainerStarted","Data":"da4ef89dee76ea944c2a102f2a2867ada8afb01fe99d95b214c51dde58ade158"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.232285 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:38 crc 
kubenswrapper[4749]: I0225 07:34:38.233658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" event={"ID":"9fcf9db9-6abb-445d-aa8c-8d5e60431838","Type":"ContainerStarted","Data":"52a95d8e423c43311e843d8fe7bc9be190977b3586e766e1820c6a4b2dfc52d0"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.233803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.235152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" event={"ID":"0ada39b5-592e-448e-9a12-2f9e95906a74","Type":"ContainerStarted","Data":"d06d3de878c9990628606f8496484c1e39b00cef22bc876ad2cbdc8169bb8a7e"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.235367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.236560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" event={"ID":"0594f0f5-d0ce-4c43-8572-3dc16130152e","Type":"ContainerStarted","Data":"663c5ada49ebb5541ece63e8fec55e4cb917b3d4845e5b269f901d458dc4ffa7"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.236782 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.237820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" event={"ID":"23e28440-3881-4d38-885e-3a20842b117d","Type":"ContainerStarted","Data":"d8e3cc31f8f83ff40daad8e13f6ced526b3dcc010aabe6917fe1f9728d3fc25a"} 
Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.237953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.239202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" event={"ID":"e1152a21-82e3-4a7f-92c8-8633abeecb26","Type":"ContainerStarted","Data":"9f158b41e7a830c516f0212de3310658d25c355927471c05ea04cbc6674cb748"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.239446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.240821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" event={"ID":"e1c9da55-b9bc-4bb7-b73a-ca73c928f333","Type":"ContainerStarted","Data":"88832a3b2d2bc18541c8828db1263faf6db3fa44e48110ca0eec6a218c3eabe2"} Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.241090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.241121 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.251313 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" podStartSLOduration=5.998261251 podStartE2EDuration="29.251299808s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.207829093 +0000 UTC m=+1004.569655113" lastFinishedPulling="2026-02-25 07:34:34.46086764 
+0000 UTC m=+1027.822693670" observedRunningTime="2026-02-25 07:34:38.24932992 +0000 UTC m=+1031.611155940" watchObservedRunningTime="2026-02-25 07:34:38.251299808 +0000 UTC m=+1031.613125828" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.275584 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" podStartSLOduration=3.310659897 podStartE2EDuration="29.275566573s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.602556925 +0000 UTC m=+1004.964382945" lastFinishedPulling="2026-02-25 07:34:37.567463591 +0000 UTC m=+1030.929289621" observedRunningTime="2026-02-25 07:34:38.274072977 +0000 UTC m=+1031.635898997" watchObservedRunningTime="2026-02-25 07:34:38.275566573 +0000 UTC m=+1031.637392593" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.309151 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" podStartSLOduration=5.844403318 podStartE2EDuration="29.309133543s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:10.993978234 +0000 UTC m=+1004.355804254" lastFinishedPulling="2026-02-25 07:34:34.458708449 +0000 UTC m=+1027.820534479" observedRunningTime="2026-02-25 07:34:38.308562379 +0000 UTC m=+1031.670388409" watchObservedRunningTime="2026-02-25 07:34:38.309133543 +0000 UTC m=+1031.670959563" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.360261 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" podStartSLOduration=6.250971495 podStartE2EDuration="29.360240025s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.352054712 +0000 UTC m=+1004.713880732" lastFinishedPulling="2026-02-25 07:34:34.461323222 +0000 UTC 
m=+1027.823149262" observedRunningTime="2026-02-25 07:34:38.341555965 +0000 UTC m=+1031.703381985" watchObservedRunningTime="2026-02-25 07:34:38.360240025 +0000 UTC m=+1031.722066045" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.385358 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" podStartSLOduration=2.830041125 podStartE2EDuration="28.38534213s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.845958487 +0000 UTC m=+1005.207784507" lastFinishedPulling="2026-02-25 07:34:37.401259492 +0000 UTC m=+1030.763085512" observedRunningTime="2026-02-25 07:34:38.381208101 +0000 UTC m=+1031.743034121" watchObservedRunningTime="2026-02-25 07:34:38.38534213 +0000 UTC m=+1031.747168150" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.403689 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" podStartSLOduration=2.477692046 podStartE2EDuration="28.403667793s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.59905715 +0000 UTC m=+1004.960883170" lastFinishedPulling="2026-02-25 07:34:37.525032887 +0000 UTC m=+1030.886858917" observedRunningTime="2026-02-25 07:34:38.401204794 +0000 UTC m=+1031.763030814" watchObservedRunningTime="2026-02-25 07:34:38.403667793 +0000 UTC m=+1031.765493813" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.424353 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" podStartSLOduration=5.974793393 podStartE2EDuration="29.424339551s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.009191091 +0000 UTC m=+1004.371017111" lastFinishedPulling="2026-02-25 07:34:34.458737249 +0000 UTC m=+1027.820563269" 
observedRunningTime="2026-02-25 07:34:38.421458852 +0000 UTC m=+1031.783284872" watchObservedRunningTime="2026-02-25 07:34:38.424339551 +0000 UTC m=+1031.786165561" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.457401 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" podStartSLOduration=6.584134823 podStartE2EDuration="29.457380789s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.587557443 +0000 UTC m=+1004.949383463" lastFinishedPulling="2026-02-25 07:34:34.460803409 +0000 UTC m=+1027.822629429" observedRunningTime="2026-02-25 07:34:38.455511023 +0000 UTC m=+1031.817337043" watchObservedRunningTime="2026-02-25 07:34:38.457380789 +0000 UTC m=+1031.819206809" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.481741 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" podStartSLOduration=2.639678473 podStartE2EDuration="28.481723746s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.684495021 +0000 UTC m=+1005.046321041" lastFinishedPulling="2026-02-25 07:34:37.526540284 +0000 UTC m=+1030.888366314" observedRunningTime="2026-02-25 07:34:38.478271403 +0000 UTC m=+1031.840097423" watchObservedRunningTime="2026-02-25 07:34:38.481723746 +0000 UTC m=+1031.843549766" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.522093 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" podStartSLOduration=2.695595161 podStartE2EDuration="28.522073419s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.705258152 +0000 UTC m=+1005.067084172" lastFinishedPulling="2026-02-25 07:34:37.531736399 +0000 UTC m=+1030.893562430" 
observedRunningTime="2026-02-25 07:34:38.518933044 +0000 UTC m=+1031.880759064" watchObservedRunningTime="2026-02-25 07:34:38.522073419 +0000 UTC m=+1031.883899439" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.538578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" podStartSLOduration=6.867294724 podStartE2EDuration="29.538557477s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.246047665 +0000 UTC m=+1004.607873685" lastFinishedPulling="2026-02-25 07:34:33.917310418 +0000 UTC m=+1027.279136438" observedRunningTime="2026-02-25 07:34:38.536805885 +0000 UTC m=+1031.898631905" watchObservedRunningTime="2026-02-25 07:34:38.538557477 +0000 UTC m=+1031.900383497" Feb 25 07:34:38 crc kubenswrapper[4749]: I0225 07:34:38.555136 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" podStartSLOduration=6.388470112 podStartE2EDuration="29.555098506s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.294170015 +0000 UTC m=+1004.655996035" lastFinishedPulling="2026-02-25 07:34:34.460798409 +0000 UTC m=+1027.822624429" observedRunningTime="2026-02-25 07:34:38.554527102 +0000 UTC m=+1031.916353122" watchObservedRunningTime="2026-02-25 07:34:38.555098506 +0000 UTC m=+1031.916924526" Feb 25 07:34:39 crc kubenswrapper[4749]: I0225 07:34:39.247508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.253627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" 
event={"ID":"0a7bf49d-b6d9-474b-b97c-cb555aa93f8a","Type":"ContainerStarted","Data":"2fe9341095415d9336330ca4cef03e7a309f8167c5c0d33a916cda54d59d1338"} Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.254026 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.254900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" event={"ID":"d3e727f4-f059-41f3-94d1-fef7a644f2b2","Type":"ContainerStarted","Data":"2255c9a7ca3d5c9aa7b2fdbb08d94f1fc4c6a04cc1fcfdc0f839d8e030f481a3"} Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.256042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" event={"ID":"9bae09cb-1d81-4971-82bc-84d87a6dca77","Type":"ContainerStarted","Data":"64655e294350ce0d1bf5f644668ea22903622e77933dda7f9ab29fe8ecb12ecc"} Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.256297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.273799 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" podStartSLOduration=2.8854303999999997 podStartE2EDuration="31.273785756s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.392277743 +0000 UTC m=+1004.754103763" lastFinishedPulling="2026-02-25 07:34:39.780633059 +0000 UTC m=+1033.142459119" observedRunningTime="2026-02-25 07:34:40.270859395 +0000 UTC m=+1033.632685415" watchObservedRunningTime="2026-02-25 07:34:40.273785756 +0000 UTC m=+1033.635611776" Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 
07:34:40.292198 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" podStartSLOduration=3.083214639 podStartE2EDuration="31.292183669s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.568866782 +0000 UTC m=+1004.930692792" lastFinishedPulling="2026-02-25 07:34:39.777835802 +0000 UTC m=+1033.139661822" observedRunningTime="2026-02-25 07:34:40.28722767 +0000 UTC m=+1033.649053700" watchObservedRunningTime="2026-02-25 07:34:40.292183669 +0000 UTC m=+1033.654009689" Feb 25 07:34:40 crc kubenswrapper[4749]: I0225 07:34:40.312533 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-vx8bh" podStartSLOduration=2.368955254 podStartE2EDuration="30.31250828s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.834980472 +0000 UTC m=+1005.196806492" lastFinishedPulling="2026-02-25 07:34:39.778533498 +0000 UTC m=+1033.140359518" observedRunningTime="2026-02-25 07:34:40.302946079 +0000 UTC m=+1033.664772109" watchObservedRunningTime="2026-02-25 07:34:40.31250828 +0000 UTC m=+1033.674334310" Feb 25 07:34:41 crc kubenswrapper[4749]: I0225 07:34:41.789761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: \"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:41 crc kubenswrapper[4749]: I0225 07:34:41.799337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1ea3312-6b21-440a-b617-5681d286bcc4-cert\") pod \"infra-operator-controller-manager-79d975b745-rtn6b\" (UID: 
\"b1ea3312-6b21-440a-b617-5681d286bcc4\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:41 crc kubenswrapper[4749]: I0225 07:34:41.959396 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-drzcl" Feb 25 07:34:41 crc kubenswrapper[4749]: I0225 07:34:41.967373 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.195799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.200136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f55e1ee-705c-409e-b34a-77232bf089eb-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c\" (UID: \"6f55e1ee-705c-409e-b34a-77232bf089eb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.425324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-k9967" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.433211 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:42 crc kubenswrapper[4749]: W0225 07:34:42.483680 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ea3312_6b21_440a_b617_5681d286bcc4.slice/crio-82fa4bb86aabc741d464fc142290b5acdb6004fd8f7ea958d8da5668258a9a04 WatchSource:0}: Error finding container 82fa4bb86aabc741d464fc142290b5acdb6004fd8f7ea958d8da5668258a9a04: Status 404 returned error can't find the container with id 82fa4bb86aabc741d464fc142290b5acdb6004fd8f7ea958d8da5668258a9a04 Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.497101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b"] Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.501258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.501315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.509068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-metrics-certs\") pod 
\"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.509251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/37a9e86e-0ee0-4447-910a-a185f4681508-webhook-certs\") pod \"openstack-operator-controller-manager-6699bbbd4-8bgxl\" (UID: \"37a9e86e-0ee0-4447-910a-a185f4681508\") " pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.699607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c"] Feb 25 07:34:42 crc kubenswrapper[4749]: W0225 07:34:42.706930 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f55e1ee_705c_409e_b34a_77232bf089eb.slice/crio-63b08d6b57ebd7ffd25c159940ce58466ea1b8bdd71b50721f92384c24ee586e WatchSource:0}: Error finding container 63b08d6b57ebd7ffd25c159940ce58466ea1b8bdd71b50721f92384c24ee586e: Status 404 returned error can't find the container with id 63b08d6b57ebd7ffd25c159940ce58466ea1b8bdd71b50721f92384c24ee586e Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.718896 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-756xx" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.727573 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:42 crc kubenswrapper[4749]: I0225 07:34:42.927667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl"] Feb 25 07:34:42 crc kubenswrapper[4749]: W0225 07:34:42.936085 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a9e86e_0ee0_4447_910a_a185f4681508.slice/crio-fdc89f7b5d24b4932bcd4e2a518b9c512d965f105d3df8032723f98368aaf104 WatchSource:0}: Error finding container fdc89f7b5d24b4932bcd4e2a518b9c512d965f105d3df8032723f98368aaf104: Status 404 returned error can't find the container with id fdc89f7b5d24b4932bcd4e2a518b9c512d965f105d3df8032723f98368aaf104 Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.280482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" event={"ID":"b1ea3312-6b21-440a-b617-5681d286bcc4","Type":"ContainerStarted","Data":"82fa4bb86aabc741d464fc142290b5acdb6004fd8f7ea958d8da5668258a9a04"} Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.282262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" event={"ID":"37a9e86e-0ee0-4447-910a-a185f4681508","Type":"ContainerStarted","Data":"6c07f3a9fa459137e1ab0825163f25f3bae3a0494284e3c26104e52a31c193a0"} Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.282330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" event={"ID":"37a9e86e-0ee0-4447-910a-a185f4681508","Type":"ContainerStarted","Data":"fdc89f7b5d24b4932bcd4e2a518b9c512d965f105d3df8032723f98368aaf104"} Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.282348 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.284480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" event={"ID":"6f55e1ee-705c-409e-b34a-77232bf089eb","Type":"ContainerStarted","Data":"63b08d6b57ebd7ffd25c159940ce58466ea1b8bdd71b50721f92384c24ee586e"} Feb 25 07:34:43 crc kubenswrapper[4749]: I0225 07:34:43.313930 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" podStartSLOduration=33.313914592 podStartE2EDuration="33.313914592s" podCreationTimestamp="2026-02-25 07:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:34:43.31386669 +0000 UTC m=+1036.675692710" watchObservedRunningTime="2026-02-25 07:34:43.313914592 +0000 UTC m=+1036.675740612" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.308029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" event={"ID":"b1ea3312-6b21-440a-b617-5681d286bcc4","Type":"ContainerStarted","Data":"0a94f268da921b642dc73ac553e336b96cfff7e63d3c0c0d1dd932918f8c8aea"} Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.308756 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.309662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" event={"ID":"0d2964a7-7341-4f1f-ab51-b648ea057535","Type":"ContainerStarted","Data":"9546a1b64d8a52c29ab7b2b78bd66ef0a73699b34d34f33c8b6f3be3bcab9842"} Feb 25 07:34:45 crc 
kubenswrapper[4749]: I0225 07:34:45.309846 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.311039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" event={"ID":"6f55e1ee-705c-409e-b34a-77232bf089eb","Type":"ContainerStarted","Data":"866b129ab16d85c1a89e3ec6f495766053b68db00d0cbdcbbd3dc8cf1ed857ea"} Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.311183 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.323369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" podStartSLOduration=33.879590051 podStartE2EDuration="36.323354443s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:42.486333858 +0000 UTC m=+1035.848159918" lastFinishedPulling="2026-02-25 07:34:44.93009828 +0000 UTC m=+1038.291924310" observedRunningTime="2026-02-25 07:34:45.32284154 +0000 UTC m=+1038.684667570" watchObservedRunningTime="2026-02-25 07:34:45.323354443 +0000 UTC m=+1038.685180463" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.373943 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" podStartSLOduration=34.134925648 podStartE2EDuration="36.373928044s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:42.709781829 +0000 UTC m=+1036.071607839" lastFinishedPulling="2026-02-25 07:34:44.948784175 +0000 UTC m=+1038.310610235" observedRunningTime="2026-02-25 07:34:45.370809295 +0000 UTC 
m=+1038.732635315" watchObservedRunningTime="2026-02-25 07:34:45.373928044 +0000 UTC m=+1038.735754064" Feb 25 07:34:45 crc kubenswrapper[4749]: I0225 07:34:45.384946 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" podStartSLOduration=2.412781061 podStartE2EDuration="36.384925438s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:10.960781073 +0000 UTC m=+1004.322607093" lastFinishedPulling="2026-02-25 07:34:44.93292543 +0000 UTC m=+1038.294751470" observedRunningTime="2026-02-25 07:34:45.382682412 +0000 UTC m=+1038.744508432" watchObservedRunningTime="2026-02-25 07:34:45.384925438 +0000 UTC m=+1038.746751458" Feb 25 07:34:46 crc kubenswrapper[4749]: I0225 07:34:46.321568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" event={"ID":"17bb447b-e6f8-4a04-98ea-8559cbd26d34","Type":"ContainerStarted","Data":"3824c882c0875f939d94de6425965820df0e9193634e8d2ffc2a2ba4bf970103"} Feb 25 07:34:46 crc kubenswrapper[4749]: I0225 07:34:46.346188 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" podStartSLOduration=2.7921277829999998 podStartE2EDuration="37.34616426s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.202496644 +0000 UTC m=+1004.564322664" lastFinishedPulling="2026-02-25 07:34:45.756533071 +0000 UTC m=+1039.118359141" observedRunningTime="2026-02-25 07:34:46.343262567 +0000 UTC m=+1039.705088627" watchObservedRunningTime="2026-02-25 07:34:46.34616426 +0000 UTC m=+1039.707990310" Feb 25 07:34:49 crc kubenswrapper[4749]: I0225 07:34:49.346380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" 
event={"ID":"c881d86e-332d-419d-8c8a-9b7dfafe8c3c","Type":"ContainerStarted","Data":"ea5f3941a370703c200d667ecd21b6727dbb42aa0fb51ceb4fdfd8a55fc98c25"} Feb 25 07:34:49 crc kubenswrapper[4749]: I0225 07:34:49.347231 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:34:49 crc kubenswrapper[4749]: I0225 07:34:49.365088 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" podStartSLOduration=2.902058489 podStartE2EDuration="40.365072645s" podCreationTimestamp="2026-02-25 07:34:09 +0000 UTC" firstStartedPulling="2026-02-25 07:34:11.368474838 +0000 UTC m=+1004.730300858" lastFinishedPulling="2026-02-25 07:34:48.831488984 +0000 UTC m=+1042.193315014" observedRunningTime="2026-02-25 07:34:49.363752962 +0000 UTC m=+1042.725578982" watchObservedRunningTime="2026-02-25 07:34:49.365072645 +0000 UTC m=+1042.726898665" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.038963 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-9jh8w" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.091559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-4jdng" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.102528 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-7h9cc" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.214879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.220682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zjvss" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.230439 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-5nbpj" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.313239 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-fn75h" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.428087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-9tv6m" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.446764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-2tccs" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.450736 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-b2j2v" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.490608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-g4h2l" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.508363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-sptf9" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.540588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lrvt" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.580576 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-n4v8h" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.614281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5gt8h" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.647703 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-nqhfv" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.658179 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-wv258" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.678956 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-wh7zn" Feb 25 07:34:50 crc kubenswrapper[4749]: I0225 07:34:50.843567 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-7fzsq" Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.671571 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.671974 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.672041 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.672883 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.672980 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65" gracePeriod=600 Feb 25 07:34:51 crc kubenswrapper[4749]: I0225 07:34:51.975629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-rtn6b" Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.368986 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65" exitCode=0 Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.369046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65"} Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.369374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6"} Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.369397 4749 scope.go:117] "RemoveContainer" containerID="d080eb1a63c666c3e8a6fcb7fc91da3afac6fcafe4c078294e5ca59b14f88f13" Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.438369 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c" Feb 25 07:34:52 crc kubenswrapper[4749]: I0225 07:34:52.735356 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6699bbbd4-8bgxl" Feb 25 07:35:00 crc kubenswrapper[4749]: I0225 07:35:00.519912 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-q64h7" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.640140 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.642528 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.645639 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.645888 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.646008 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6gkp5" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.646111 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.650007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.741581 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.743468 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.745332 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.762427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.783870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.783946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ww6\" (UniqueName: \"kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.884726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ww6\" (UniqueName: \"kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.884983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 
25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.885114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.885223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6vw\" (UniqueName: \"kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.885297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.886498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.914678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ww6\" (UniqueName: \"kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6\") pod \"dnsmasq-dns-675f4bcbfc-m26lw\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 
07:35:17.958588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.987006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6vw\" (UniqueName: \"kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.987144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.987250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.988120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:17 crc kubenswrapper[4749]: I0225 07:35:17.988563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.002320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6vw\" (UniqueName: \"kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw\") pod \"dnsmasq-dns-78dd6ddcc-4rd2g\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.058683 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.363865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:18 crc kubenswrapper[4749]: W0225 07:35:18.369014 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ba2349e_1226_4442_8c8c_0826e886bc99.slice/crio-67f9b16b3f9535527e96c0d14bd5200b6ab379b6607374378349e7fccee9756f WatchSource:0}: Error finding container 67f9b16b3f9535527e96c0d14bd5200b6ab379b6607374378349e7fccee9756f: Status 404 returned error can't find the container with id 67f9b16b3f9535527e96c0d14bd5200b6ab379b6607374378349e7fccee9756f Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.513207 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:18 crc kubenswrapper[4749]: W0225 07:35:18.516119 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0889d84_d343_4b29_972f_0719818e2761.slice/crio-cb81bd2331fc59eefc9d6e0f0817e3c567235edc582e832914930fb6f715c6ca WatchSource:0}: Error finding container cb81bd2331fc59eefc9d6e0f0817e3c567235edc582e832914930fb6f715c6ca: Status 404 returned error can't find the container with id 
cb81bd2331fc59eefc9d6e0f0817e3c567235edc582e832914930fb6f715c6ca Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.597847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" event={"ID":"7ba2349e-1226-4442-8c8c-0826e886bc99","Type":"ContainerStarted","Data":"67f9b16b3f9535527e96c0d14bd5200b6ab379b6607374378349e7fccee9756f"} Feb 25 07:35:18 crc kubenswrapper[4749]: I0225 07:35:18.599452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" event={"ID":"c0889d84-d343-4b29-972f-0719818e2761","Type":"ContainerStarted","Data":"cb81bd2331fc59eefc9d6e0f0817e3c567235edc582e832914930fb6f715c6ca"} Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.124787 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.151071 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.152431 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.169755 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.219248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.219331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.219356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rtmr\" (UniqueName: \"kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.320730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.320778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rtmr\" (UniqueName: 
\"kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.320827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.321721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.322325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.361452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rtmr\" (UniqueName: \"kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr\") pod \"dnsmasq-dns-666b6646f7-kmd6j\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.427094 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.463129 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.464272 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.473350 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.483179 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"] Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.624341 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.624712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg2gr\" (UniqueName: \"kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.624745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.726038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.726214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg2gr\" (UniqueName: \"kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.726343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.727350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.731413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.760062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg2gr\" (UniqueName: \"kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr\") pod 
\"dnsmasq-dns-57d769cc4f-bt5ms\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.782949 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:20 crc kubenswrapper[4749]: I0225 07:35:20.989673 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:35:20 crc kubenswrapper[4749]: W0225 07:35:20.993548 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62112103_059c_4b14_803d_8f6e6189f3b6.slice/crio-1985cc41a20ca0286403f3594cb1e4a5a9dc3690bacf62ccc4b37629d3f201a6 WatchSource:0}: Error finding container 1985cc41a20ca0286403f3594cb1e4a5a9dc3690bacf62ccc4b37629d3f201a6: Status 404 returned error can't find the container with id 1985cc41a20ca0286403f3594cb1e4a5a9dc3690bacf62ccc4b37629d3f201a6 Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.279138 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.281661 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.285585 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.285585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.285857 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.285993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.287097 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.287287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tqltk" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.289727 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.298201 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.311872 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"] Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.438992 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 
07:35:21.439055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439302 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfgxp\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.439409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.540979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " 
pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfgxp\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.541946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.542503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.542513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.542781 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.543664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.543661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.548217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.548322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.548731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.549518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.567843 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.572409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " 
pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.580136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfgxp\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp\") pod \"rabbitmq-server-0\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") " pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.587852 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.628210 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.632282 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.632404 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.632672 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.632946 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.633151 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.633488 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jxcd4" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.634609 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 07:35:21 crc 
kubenswrapper[4749]: I0225 07:35:21.640510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.642779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.643973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.644009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.644029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klmg\" (UniqueName: 
\"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.644047 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.647367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" event={"ID":"62112103-059c-4b14-803d-8f6e6189f3b6","Type":"ContainerStarted","Data":"1985cc41a20ca0286403f3594cb1e4a5a9dc3690bacf62ccc4b37629d3f201a6"} Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.745494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.745978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klmg\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc 
kubenswrapper[4749]: I0225 07:35:21.746241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.746330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.748645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.749379 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.751903 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.752038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.752408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.752472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.756912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.757134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.761027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.763094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.768571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klmg\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.811298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:21 crc kubenswrapper[4749]: I0225 07:35:21.948778 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.884700 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.886220 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.888948 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bf5f5" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.890055 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.894873 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.894907 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.898969 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 07:35:22 crc kubenswrapper[4749]: I0225 07:35:22.907017 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v29q\" (UniqueName: 
\"kubernetes.io/projected/725d8d83-d9a6-4c99-86f1-71371b41c11f-kube-api-access-5v29q\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066835 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-default\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.066869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.067126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-kolla-config\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.067337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-default\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-kolla-config\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v29q\" (UniqueName: \"kubernetes.io/projected/725d8d83-d9a6-4c99-86f1-71371b41c11f-kube-api-access-5v29q\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.169749 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: 
I0225 07:35:23.172152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.172686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-config-data-default\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.173151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-kolla-config\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.174131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/725d8d83-d9a6-4c99-86f1-71371b41c11f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.176377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.176399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/725d8d83-d9a6-4c99-86f1-71371b41c11f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.197469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v29q\" (UniqueName: \"kubernetes.io/projected/725d8d83-d9a6-4c99-86f1-71371b41c11f-kube-api-access-5v29q\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.198889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"725d8d83-d9a6-4c99-86f1-71371b41c11f\") " pod="openstack/openstack-galera-0" Feb 25 07:35:23 crc kubenswrapper[4749]: I0225 07:35:23.207480 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.379197 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.382122 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.385816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.386326 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.386493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.386665 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6c8hb" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.394284 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.489810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490210 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8m8\" (UniqueName: \"kubernetes.io/projected/966f467d-732d-45df-b9d1-bb88be2e34cf-kube-api-access-zw8m8\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.490338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.591720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8m8\" (UniqueName: \"kubernetes.io/projected/966f467d-732d-45df-b9d1-bb88be2e34cf-kube-api-access-zw8m8\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.591855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.591881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.591935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.591989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.592019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.592044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.592088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.592459 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.593281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.593617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.594250 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.594912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/966f467d-732d-45df-b9d1-bb88be2e34cf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.597335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.599697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966f467d-732d-45df-b9d1-bb88be2e34cf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 
crc kubenswrapper[4749]: I0225 07:35:24.613909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8m8\" (UniqueName: \"kubernetes.io/projected/966f467d-732d-45df-b9d1-bb88be2e34cf-kube-api-access-zw8m8\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.633821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"966f467d-732d-45df-b9d1-bb88be2e34cf\") " pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.643049 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.643969 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.647125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vfbqp" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.647401 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.647569 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.672607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.698617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-config-data\") pod \"memcached-0\" (UID: 
\"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.698678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.698703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kolla-config\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.698743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s9sh\" (UniqueName: \"kubernetes.io/projected/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kube-api-access-9s9sh\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.699024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.728385 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.800711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kolla-config\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.800780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s9sh\" (UniqueName: \"kubernetes.io/projected/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kube-api-access-9s9sh\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.800840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.800875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-config-data\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.800895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.801298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kolla-config\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.802107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-config-data\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.804111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-combined-ca-bundle\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.809155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-memcached-tls-certs\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:24 crc kubenswrapper[4749]: I0225 07:35:24.822241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s9sh\" (UniqueName: \"kubernetes.io/projected/df3ffcd0-1fa9-4c25-a331-33baf2a3acfd-kube-api-access-9s9sh\") pod \"memcached-0\" (UID: \"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd\") " pod="openstack/memcached-0" Feb 25 07:35:25 crc kubenswrapper[4749]: I0225 07:35:25.013656 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 25 07:35:25 crc kubenswrapper[4749]: I0225 07:35:25.686196 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" event={"ID":"82616919-9ac9-4161-b12d-a83d80828514","Type":"ContainerStarted","Data":"c3ced8a39d0aebbe3eb63cac7ed023a0f9fdbf0f8b901b42f47a008d33cbb3b5"} Feb 25 07:35:26 crc kubenswrapper[4749]: I0225 07:35:26.823641 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:35:26 crc kubenswrapper[4749]: I0225 07:35:26.824762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:35:26 crc kubenswrapper[4749]: I0225 07:35:26.834249 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qj45j" Feb 25 07:35:26 crc kubenswrapper[4749]: I0225 07:35:26.838628 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:35:26 crc kubenswrapper[4749]: I0225 07:35:26.935799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxr6\" (UniqueName: \"kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6\") pod \"kube-state-metrics-0\" (UID: \"54b96c95-7c89-40c7-a51c-6a5c4c59a036\") " pod="openstack/kube-state-metrics-0" Feb 25 07:35:27 crc kubenswrapper[4749]: I0225 07:35:27.037068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxr6\" (UniqueName: \"kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6\") pod \"kube-state-metrics-0\" (UID: \"54b96c95-7c89-40c7-a51c-6a5c4c59a036\") " pod="openstack/kube-state-metrics-0" Feb 25 07:35:27 crc kubenswrapper[4749]: I0225 07:35:27.059431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxr6\" 
(UniqueName: \"kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6\") pod \"kube-state-metrics-0\" (UID: \"54b96c95-7c89-40c7-a51c-6a5c4c59a036\") " pod="openstack/kube-state-metrics-0" Feb 25 07:35:27 crc kubenswrapper[4749]: I0225 07:35:27.144960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.266131 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.267761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.270628 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.270898 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.271110 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.271214 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.271474 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5xmcc" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.279121 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jt2g"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.280094 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.284092 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-v4wbp" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.284724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.284830 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.303405 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.348419 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.356468 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-c5k7v"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.357937 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.372916 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-c5k7v"] Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-etc-ovs\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-lib\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-run\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-config\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-log-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md65c\" (UniqueName: \"kubernetes.io/projected/afd2522b-bd24-455e-bc5f-a62caba2ff23-kube-api-access-md65c\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75a58813-9b9d-4d38-aa91-527463cdbccf-scripts\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-combined-ca-bundle\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.430977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " 
pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4lgf\" (UniqueName: \"kubernetes.io/projected/4197a74b-885d-41df-8484-05e645656b2a-kube-api-access-b4lgf\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-log\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2t8v\" (UniqueName: \"kubernetes.io/projected/75a58813-9b9d-4d38-aa91-527463cdbccf-kube-api-access-v2t8v\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-ovn-controller-tls-certs\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " 
pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.431127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4197a74b-885d-41df-8484-05e645656b2a-scripts\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.531988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-combined-ca-bundle\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4lgf\" (UniqueName: \"kubernetes.io/projected/4197a74b-885d-41df-8484-05e645656b2a-kube-api-access-b4lgf\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532139 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-log\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2t8v\" (UniqueName: \"kubernetes.io/projected/75a58813-9b9d-4d38-aa91-527463cdbccf-kube-api-access-v2t8v\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-ovn-controller-tls-certs\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4197a74b-885d-41df-8484-05e645656b2a-scripts\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-etc-ovs\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-lib\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-run\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-config\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 
crc kubenswrapper[4749]: I0225 07:35:31.532395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-log-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md65c\" (UniqueName: \"kubernetes.io/projected/afd2522b-bd24-455e-bc5f-a62caba2ff23-kube-api-access-md65c\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.532488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/75a58813-9b9d-4d38-aa91-527463cdbccf-scripts\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-log-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-etc-ovs\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-run\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533345 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " 
pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-lib\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.534476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4197a74b-885d-41df-8484-05e645656b2a-var-run-ovn\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.534827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.533346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/75a58813-9b9d-4d38-aa91-527463cdbccf-var-log\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.535146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75a58813-9b9d-4d38-aa91-527463cdbccf-scripts\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.536575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-config\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.536963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd2522b-bd24-455e-bc5f-a62caba2ff23-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.538116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-ovn-controller-tls-certs\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.538338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.546947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4197a74b-885d-41df-8484-05e645656b2a-scripts\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.547274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4197a74b-885d-41df-8484-05e645656b2a-combined-ca-bundle\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" 
Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.547735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.549895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2t8v\" (UniqueName: \"kubernetes.io/projected/75a58813-9b9d-4d38-aa91-527463cdbccf-kube-api-access-v2t8v\") pod \"ovn-controller-ovs-c5k7v\" (UID: \"75a58813-9b9d-4d38-aa91-527463cdbccf\") " pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.550033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd2522b-bd24-455e-bc5f-a62caba2ff23-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.550677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md65c\" (UniqueName: \"kubernetes.io/projected/afd2522b-bd24-455e-bc5f-a62caba2ff23-kube-api-access-md65c\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.551419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4lgf\" (UniqueName: \"kubernetes.io/projected/4197a74b-885d-41df-8484-05e645656b2a-kube-api-access-b4lgf\") pod \"ovn-controller-9jt2g\" (UID: \"4197a74b-885d-41df-8484-05e645656b2a\") " pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.552822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"afd2522b-bd24-455e-bc5f-a62caba2ff23\") " pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.598534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.610713 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g" Feb 25 07:35:31 crc kubenswrapper[4749]: I0225 07:35:31.673302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.241311 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.243828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.248924 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.250283 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.250978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-r782n" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.251807 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.252729 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.360154 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.360211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.360569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.360772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.360940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.361036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.361097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.361211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bxc\" (UniqueName: \"kubernetes.io/projected/e8fa832b-da81-48f1-b4d6-e72688303d93-kube-api-access-f2bxc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bxc\" (UniqueName: \"kubernetes.io/projected/e8fa832b-da81-48f1-b4d6-e72688303d93-kube-api-access-f2bxc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.462482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.463171 4749 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.463345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.463733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8fa832b-da81-48f1-b4d6-e72688303d93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.464954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.473114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.473167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.492370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8fa832b-da81-48f1-b4d6-e72688303d93-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.514546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bxc\" (UniqueName: \"kubernetes.io/projected/e8fa832b-da81-48f1-b4d6-e72688303d93-kube-api-access-f2bxc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.520645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8fa832b-da81-48f1-b4d6-e72688303d93\") " pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:33 crc kubenswrapper[4749]: I0225 07:35:33.568459 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 25 07:35:34 crc kubenswrapper[4749]: E0225 07:35:34.387565 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 07:35:34 crc kubenswrapper[4749]: E0225 07:35:34.388105 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2ww6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-m26lw_openstack(7ba2349e-1226-4442-8c8c-0826e886bc99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:35:34 crc kubenswrapper[4749]: E0225 07:35:34.389941 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" podUID="7ba2349e-1226-4442-8c8c-0826e886bc99" Feb 25 07:35:34 crc kubenswrapper[4749]: I0225 07:35:34.662660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 25 07:35:34 crc kubenswrapper[4749]: I0225 07:35:34.750219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"725d8d83-d9a6-4c99-86f1-71371b41c11f","Type":"ContainerStarted","Data":"99ee13cf41090018ebd2fa8eda6cc2882ec93e0ef1c56e0e855ebaaa1d4b6f7e"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.047360 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.108601 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: W0225 07:35:35.119535 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54b96c95_7c89_40c7_a51c_6a5c4c59a036.slice/crio-53594703c8f052292af4dc3a4af1d5b822faa7d949e2e16580a536ca5a8aa8f7 WatchSource:0}: Error finding container 53594703c8f052292af4dc3a4af1d5b822faa7d949e2e16580a536ca5a8aa8f7: Status 404 returned error can't find the container with id 53594703c8f052292af4dc3a4af1d5b822faa7d949e2e16580a536ca5a8aa8f7 Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.146880 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.154361 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.163428 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.170871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.182429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.196339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config\") pod \"7ba2349e-1226-4442-8c8c-0826e886bc99\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.196397 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w2ww6\" (UniqueName: \"kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6\") pod \"7ba2349e-1226-4442-8c8c-0826e886bc99\" (UID: \"7ba2349e-1226-4442-8c8c-0826e886bc99\") " Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.198333 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config" (OuterVolumeSpecName: "config") pod "7ba2349e-1226-4442-8c8c-0826e886bc99" (UID: "7ba2349e-1226-4442-8c8c-0826e886bc99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.204361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6" (OuterVolumeSpecName: "kube-api-access-w2ww6") pod "7ba2349e-1226-4442-8c8c-0826e886bc99" (UID: "7ba2349e-1226-4442-8c8c-0826e886bc99"). InnerVolumeSpecName "kube-api-access-w2ww6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:35 crc kubenswrapper[4749]: W0225 07:35:35.233864 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd2522b_bd24_455e_bc5f_a62caba2ff23.slice/crio-919b8977ec065707c2c1b46f49296ccc9b1f29cfbfb156df1322cbc9b4a89e46 WatchSource:0}: Error finding container 919b8977ec065707c2c1b46f49296ccc9b1f29cfbfb156df1322cbc9b4a89e46: Status 404 returned error can't find the container with id 919b8977ec065707c2c1b46f49296ccc9b1f29cfbfb156df1322cbc9b4a89e46 Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.238498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.299051 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ba2349e-1226-4442-8c8c-0826e886bc99-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.299092 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2ww6\" (UniqueName: \"kubernetes.io/projected/7ba2349e-1226-4442-8c8c-0826e886bc99-kube-api-access-w2ww6\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:35 crc kubenswrapper[4749]: W0225 07:35:35.325050 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75a58813_9b9d_4d38_aa91_527463cdbccf.slice/crio-9b2aa530fcf87a977e63349e0b90468694ab5cceb1a03c25bf8b8368a301f6f6 WatchSource:0}: Error finding container 9b2aa530fcf87a977e63349e0b90468694ab5cceb1a03c25bf8b8368a301f6f6: Status 404 returned error can't find the container with id 9b2aa530fcf87a977e63349e0b90468694ab5cceb1a03c25bf8b8368a301f6f6 Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.336106 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-c5k7v"] Feb 25 07:35:35 
crc kubenswrapper[4749]: I0225 07:35:35.763758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g" event={"ID":"4197a74b-885d-41df-8484-05e645656b2a","Type":"ContainerStarted","Data":"f26d9082499d75a5b6e6933d35dd9c9fafe21fc40ff0535a2aa0331a3d75de3a"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.765575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54b96c95-7c89-40c7-a51c-6a5c4c59a036","Type":"ContainerStarted","Data":"53594703c8f052292af4dc3a4af1d5b822faa7d949e2e16580a536ca5a8aa8f7"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.767472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5k7v" event={"ID":"75a58813-9b9d-4d38-aa91-527463cdbccf","Type":"ContainerStarted","Data":"9b2aa530fcf87a977e63349e0b90468694ab5cceb1a03c25bf8b8368a301f6f6"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.769445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerStarted","Data":"d97b9555c9cace15b14eb9eda24cd7d0c285b8c79a14f23ab46313e31dd0fae0"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.771444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"afd2522b-bd24-455e-bc5f-a62caba2ff23","Type":"ContainerStarted","Data":"919b8977ec065707c2c1b46f49296ccc9b1f29cfbfb156df1322cbc9b4a89e46"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.772794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerStarted","Data":"acd0ea76057e6bb9b468d33b843f8e8701fd73e07169e1c88cf7fb42daf8b3c9"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.774830 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.775394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m26lw" event={"ID":"7ba2349e-1226-4442-8c8c-0826e886bc99","Type":"ContainerDied","Data":"67f9b16b3f9535527e96c0d14bd5200b6ab379b6607374378349e7fccee9756f"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.778005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"966f467d-732d-45df-b9d1-bb88be2e34cf","Type":"ContainerStarted","Data":"ab5b0190eaf989f41e6085a3020a115202f3608b0e2f94bdc2cf460bcb0edf24"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.784212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd","Type":"ContainerStarted","Data":"ed7bf861c83c45fe67a6a0b54b5587bd94f4950a9ddc9a847ce41eed65f63fa3"} Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.798952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 25 07:35:35 crc kubenswrapper[4749]: W0225 07:35:35.807277 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8fa832b_da81_48f1_b4d6_e72688303d93.slice/crio-df5ba40286da0676ce679f876fe07b9296f8729583111d7ac9eeececcdd8d7a7 WatchSource:0}: Error finding container df5ba40286da0676ce679f876fe07b9296f8729583111d7ac9eeececcdd8d7a7: Status 404 returned error can't find the container with id df5ba40286da0676ce679f876fe07b9296f8729583111d7ac9eeececcdd8d7a7 Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.854136 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:35 crc kubenswrapper[4749]: I0225 07:35:35.854206 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-m26lw"] Feb 25 07:35:36 crc kubenswrapper[4749]: E0225 07:35:36.014015 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 25 07:35:36 crc kubenswrapper[4749]: E0225 07:35:36.014185 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ql6vw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Secur
ityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4rd2g_openstack(c0889d84-d343-4b29-972f-0719818e2761): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:35:36 crc kubenswrapper[4749]: E0225 07:35:36.015420 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" podUID="c0889d84-d343-4b29-972f-0719818e2761" Feb 25 07:35:36 crc kubenswrapper[4749]: I0225 07:35:36.793066 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8fa832b-da81-48f1-b4d6-e72688303d93","Type":"ContainerStarted","Data":"df5ba40286da0676ce679f876fe07b9296f8729583111d7ac9eeececcdd8d7a7"} Feb 25 07:35:36 crc kubenswrapper[4749]: I0225 07:35:36.796057 4749 generic.go:334] "Generic (PLEG): container finished" podID="62112103-059c-4b14-803d-8f6e6189f3b6" containerID="0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a" exitCode=0 Feb 25 07:35:36 crc kubenswrapper[4749]: I0225 07:35:36.796100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" event={"ID":"62112103-059c-4b14-803d-8f6e6189f3b6","Type":"ContainerDied","Data":"0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a"} Feb 25 07:35:37 crc kubenswrapper[4749]: 
I0225 07:35:37.330768 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba2349e-1226-4442-8c8c-0826e886bc99" path="/var/lib/kubelet/pods/7ba2349e-1226-4442-8c8c-0826e886bc99/volumes" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.595351 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.669717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql6vw\" (UniqueName: \"kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw\") pod \"c0889d84-d343-4b29-972f-0719818e2761\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.669912 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc\") pod \"c0889d84-d343-4b29-972f-0719818e2761\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.670081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config\") pod \"c0889d84-d343-4b29-972f-0719818e2761\" (UID: \"c0889d84-d343-4b29-972f-0719818e2761\") " Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.670379 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0889d84-d343-4b29-972f-0719818e2761" (UID: "c0889d84-d343-4b29-972f-0719818e2761"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.670779 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.670891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config" (OuterVolumeSpecName: "config") pod "c0889d84-d343-4b29-972f-0719818e2761" (UID: "c0889d84-d343-4b29-972f-0719818e2761"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.689523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw" (OuterVolumeSpecName: "kube-api-access-ql6vw") pod "c0889d84-d343-4b29-972f-0719818e2761" (UID: "c0889d84-d343-4b29-972f-0719818e2761"). InnerVolumeSpecName "kube-api-access-ql6vw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.772047 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0889d84-d343-4b29-972f-0719818e2761-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.772080 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql6vw\" (UniqueName: \"kubernetes.io/projected/c0889d84-d343-4b29-972f-0719818e2761-kube-api-access-ql6vw\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.811575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" event={"ID":"c0889d84-d343-4b29-972f-0719818e2761","Type":"ContainerDied","Data":"cb81bd2331fc59eefc9d6e0f0817e3c567235edc582e832914930fb6f715c6ca"} Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.811688 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4rd2g" Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.912712 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:38 crc kubenswrapper[4749]: I0225 07:35:38.917318 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4rd2g"] Feb 25 07:35:39 crc kubenswrapper[4749]: I0225 07:35:39.336620 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0889d84-d343-4b29-972f-0719818e2761" path="/var/lib/kubelet/pods/c0889d84-d343-4b29-972f-0719818e2761/volumes" Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.874363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5k7v" event={"ID":"75a58813-9b9d-4d38-aa91-527463cdbccf","Type":"ContainerStarted","Data":"4269fb1cb70e858b8055ddf751c4a0976f7788b4bd4379ba8d1c7dffa029b598"} Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.877586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8fa832b-da81-48f1-b4d6-e72688303d93","Type":"ContainerStarted","Data":"6cf856efb73b59a8c7edf861d6519d5ca700285e235d83705f5d1d82cbab5376"} Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.885440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"afd2522b-bd24-455e-bc5f-a62caba2ff23","Type":"ContainerStarted","Data":"d3a49846073be6840e17e4bf50de1befb7db5ce978502c059decc0071975384c"} Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.889042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" event={"ID":"62112103-059c-4b14-803d-8f6e6189f3b6","Type":"ContainerStarted","Data":"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13"} Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.889245 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j"
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.892197 4749 generic.go:334] "Generic (PLEG): container finished" podID="82616919-9ac9-4161-b12d-a83d80828514" containerID="9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1" exitCode=0
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.892324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" event={"ID":"82616919-9ac9-4161-b12d-a83d80828514","Type":"ContainerDied","Data":"9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1"}
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.902014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"725d8d83-d9a6-4c99-86f1-71371b41c11f","Type":"ContainerStarted","Data":"a1a3208ec8eadd16bda9e19d9e644b7f9270c38ebf43c99bacf4e4f5eced643f"}
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.913057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"966f467d-732d-45df-b9d1-bb88be2e34cf","Type":"ContainerStarted","Data":"f89f50b6aa493db3bc847098987a3bea77c6875e9bcd8d8e252e3bbe67365b51"}
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.924319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"df3ffcd0-1fa9-4c25-a331-33baf2a3acfd","Type":"ContainerStarted","Data":"e7159db567de85ff90a28fb66d6fa3d992464dce8e12595f3ca6f4c265672436"}
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.924644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 25 07:35:44 crc kubenswrapper[4749]: I0225 07:35:44.992488 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" podStartSLOduration=9.937555861 podStartE2EDuration="24.992468132s" podCreationTimestamp="2026-02-25 07:35:20 +0000 UTC" firstStartedPulling="2026-02-25 07:35:20.997195505 +0000 UTC m=+1074.359021525" lastFinishedPulling="2026-02-25 07:35:36.052107736 +0000 UTC m=+1089.413933796" observedRunningTime="2026-02-25 07:35:44.990836732 +0000 UTC m=+1098.352662752" watchObservedRunningTime="2026-02-25 07:35:44.992468132 +0000 UTC m=+1098.354294162"
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.933869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" event={"ID":"82616919-9ac9-4161-b12d-a83d80828514","Type":"ContainerStarted","Data":"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.934162 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms"
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.941144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g" event={"ID":"4197a74b-885d-41df-8484-05e645656b2a","Type":"ContainerStarted","Data":"a186a4a34a65d08d4515d6854106920ff1f6facad7aceed616f2f5b657eaadb7"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.941850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9jt2g"
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.943929 4749 generic.go:334] "Generic (PLEG): container finished" podID="75a58813-9b9d-4d38-aa91-527463cdbccf" containerID="4269fb1cb70e858b8055ddf751c4a0976f7788b4bd4379ba8d1c7dffa029b598" exitCode=0
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.944010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5k7v" event={"ID":"75a58813-9b9d-4d38-aa91-527463cdbccf","Type":"ContainerDied","Data":"4269fb1cb70e858b8055ddf751c4a0976f7788b4bd4379ba8d1c7dffa029b598"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.946029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerStarted","Data":"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.947443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerStarted","Data":"e913d3a116a8d0df455db3e543a9b853ca9274c3337b06248d654bc6853a9e36"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.949061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54b96c95-7c89-40c7-a51c-6a5c4c59a036","Type":"ContainerStarted","Data":"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391"}
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.963597 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" podStartSLOduration=8.223608256 podStartE2EDuration="25.96357545s" podCreationTimestamp="2026-02-25 07:35:20 +0000 UTC" firstStartedPulling="2026-02-25 07:35:25.167295548 +0000 UTC m=+1078.529121568" lastFinishedPulling="2026-02-25 07:35:42.907262742 +0000 UTC m=+1096.269088762" observedRunningTime="2026-02-25 07:35:45.962006751 +0000 UTC m=+1099.323832771" watchObservedRunningTime="2026-02-25 07:35:45.96357545 +0000 UTC m=+1099.325401480"
Feb 25 07:35:45 crc kubenswrapper[4749]: I0225 07:35:45.965049 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.187357404 podStartE2EDuration="21.965039897s" podCreationTimestamp="2026-02-25 07:35:24 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.127683841 +0000 UTC m=+1088.489509861" lastFinishedPulling="2026-02-25 07:35:42.905366304 +0000 UTC m=+1096.267192354" observedRunningTime="2026-02-25 07:35:45.036087399 +0000 UTC m=+1098.397913419" watchObservedRunningTime="2026-02-25 07:35:45.965039897 +0000 UTC m=+1099.326865927"
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.010663 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9jt2g" podStartSLOduration=7.2676337140000005 podStartE2EDuration="15.010640433s" podCreationTimestamp="2026-02-25 07:35:31 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.16411525 +0000 UTC m=+1088.525941270" lastFinishedPulling="2026-02-25 07:35:42.907121969 +0000 UTC m=+1096.268947989" observedRunningTime="2026-02-25 07:35:46.006134091 +0000 UTC m=+1099.367960111" watchObservedRunningTime="2026-02-25 07:35:46.010640433 +0000 UTC m=+1099.372466463"
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.050640 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.062492214 podStartE2EDuration="20.050576289s" podCreationTimestamp="2026-02-25 07:35:26 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.123705143 +0000 UTC m=+1088.485531163" lastFinishedPulling="2026-02-25 07:35:44.111789178 +0000 UTC m=+1097.473615238" observedRunningTime="2026-02-25 07:35:46.045728108 +0000 UTC m=+1099.407554138" watchObservedRunningTime="2026-02-25 07:35:46.050576289 +0000 UTC m=+1099.412402309"
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.961297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5k7v" event={"ID":"75a58813-9b9d-4d38-aa91-527463cdbccf","Type":"ContainerStarted","Data":"79e5b6229b505cbdd46f21c4a7acf53ff8a642d945141cf00371da3b93aa53ff"}
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.963420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-c5k7v" event={"ID":"75a58813-9b9d-4d38-aa91-527463cdbccf","Type":"ContainerStarted","Data":"e3b06942c4d7a8f88ec8323139f020cba756875db1137e7fe47ed189d4b4017f"}
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.963624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-c5k7v"
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.963766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-c5k7v"
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.964348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8fa832b-da81-48f1-b4d6-e72688303d93","Type":"ContainerStarted","Data":"f6f07b05befddba0da82f2430468bf1ee4fedcbdc69e89ba6906955380cb87b7"}
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.966939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"afd2522b-bd24-455e-bc5f-a62caba2ff23","Type":"ContainerStarted","Data":"2f59ebea6a070a8535e4c381385b204fb4d410f68810e87a9f990efe193e5a08"}
Feb 25 07:35:46 crc kubenswrapper[4749]: I0225 07:35:46.967953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 25 07:35:47 crc kubenswrapper[4749]: I0225 07:35:47.002058 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-c5k7v" podStartSLOduration=8.423945159 podStartE2EDuration="16.002027696s" podCreationTimestamp="2026-02-25 07:35:31 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.327205245 +0000 UTC m=+1088.689031265" lastFinishedPulling="2026-02-25 07:35:42.905287782 +0000 UTC m=+1096.267113802" observedRunningTime="2026-02-25 07:35:46.994322795 +0000 UTC m=+1100.356148855" watchObservedRunningTime="2026-02-25 07:35:47.002027696 +0000 UTC m=+1100.363853756"
Feb 25 07:35:47 crc kubenswrapper[4749]: I0225 07:35:47.022726 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.617406078 podStartE2EDuration="15.022694192s" podCreationTimestamp="2026-02-25 07:35:32 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.811475947 +0000 UTC m=+1089.173301977" lastFinishedPulling="2026-02-25 07:35:46.216764061 +0000 UTC m=+1099.578590091" observedRunningTime="2026-02-25 07:35:47.015060082 +0000 UTC m=+1100.376886132" watchObservedRunningTime="2026-02-25 07:35:47.022694192 +0000 UTC m=+1100.384520242"
Feb 25 07:35:47 crc kubenswrapper[4749]: I0225 07:35:47.051217 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.064363862 podStartE2EDuration="17.051199572s" podCreationTimestamp="2026-02-25 07:35:30 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.23833093 +0000 UTC m=+1088.600156950" lastFinishedPulling="2026-02-25 07:35:46.22516664 +0000 UTC m=+1099.586992660" observedRunningTime="2026-02-25 07:35:47.047042718 +0000 UTC m=+1100.408868778" watchObservedRunningTime="2026-02-25 07:35:47.051199572 +0000 UTC m=+1100.413025602"
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.569833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.569908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.640263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.987728 4749 generic.go:334] "Generic (PLEG): container finished" podID="725d8d83-d9a6-4c99-86f1-71371b41c11f" containerID="a1a3208ec8eadd16bda9e19d9e644b7f9270c38ebf43c99bacf4e4f5eced643f" exitCode=0
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.987787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"725d8d83-d9a6-4c99-86f1-71371b41c11f","Type":"ContainerDied","Data":"a1a3208ec8eadd16bda9e19d9e644b7f9270c38ebf43c99bacf4e4f5eced643f"}
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.992068 4749 generic.go:334] "Generic (PLEG): container finished" podID="966f467d-732d-45df-b9d1-bb88be2e34cf" containerID="f89f50b6aa493db3bc847098987a3bea77c6875e9bcd8d8e252e3bbe67365b51" exitCode=0
Feb 25 07:35:48 crc kubenswrapper[4749]: I0225 07:35:48.992229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"966f467d-732d-45df-b9d1-bb88be2e34cf","Type":"ContainerDied","Data":"f89f50b6aa493db3bc847098987a3bea77c6875e9bcd8d8e252e3bbe67365b51"}
Feb 25 07:35:49 crc kubenswrapper[4749]: I0225 07:35:49.599673 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 25 07:35:49 crc kubenswrapper[4749]: I0225 07:35:49.661969 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.003871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"725d8d83-d9a6-4c99-86f1-71371b41c11f","Type":"ContainerStarted","Data":"53af0411c41551527ef6c9dde6b737f171a0fae5edfcafd7ac38a399f3696720"}
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.009343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"966f467d-732d-45df-b9d1-bb88be2e34cf","Type":"ContainerStarted","Data":"70a2ac5135a115c506cc868ffa36d59fa323feb92ae595af21b46d25e1917eb5"}
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.010045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.016047 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.033634 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.800151774 podStartE2EDuration="29.033583707s" podCreationTimestamp="2026-02-25 07:35:21 +0000 UTC" firstStartedPulling="2026-02-25 07:35:34.671807748 +0000 UTC m=+1088.033633768" lastFinishedPulling="2026-02-25 07:35:42.905239681 +0000 UTC m=+1096.267065701" observedRunningTime="2026-02-25 07:35:50.032360207 +0000 UTC m=+1103.394186287" watchObservedRunningTime="2026-02-25 07:35:50.033583707 +0000 UTC m=+1103.395409797"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.071298 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.148005025 podStartE2EDuration="27.071272247s" podCreationTimestamp="2026-02-25 07:35:23 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.173154085 +0000 UTC m=+1088.534980105" lastFinishedPulling="2026-02-25 07:35:43.096421297 +0000 UTC m=+1096.458247327" observedRunningTime="2026-02-25 07:35:50.063654957 +0000 UTC m=+1103.425481057" watchObservedRunningTime="2026-02-25 07:35:50.071272247 +0000 UTC m=+1103.433098297"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.074303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.095180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.396485 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.396728 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="dnsmasq-dns" containerID="cri-o://acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb" gracePeriod=10
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.400773 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.476488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.538762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sxrj6"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.541246 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.562226 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.569210 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sxrj6"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.591058 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rtctt"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.592241 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.594670 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.608064 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rtctt"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.672503 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sxrj6"]
Feb 25 07:35:50 crc kubenswrapper[4749]: E0225 07:35:50.679380 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-pdwst ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6" podUID="a59498f2-f095-4d9b-8a5d-05dce88ee32a"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.695978 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-wcbx6"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.701014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.704418 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.713045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wcbx6"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-combined-ca-bundle\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftvx\" (UniqueName: \"kubernetes.io/projected/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-kube-api-access-8ftvx\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njp7x\" (UniqueName: \"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.720973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdwst\" (UniqueName: \"kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovs-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-config\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovn-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.721147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.724818 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.726261 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.729064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.729424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.729542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.734499 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xhmwc"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.751386 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdwst\" (UniqueName: \"kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovs-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-config\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovn-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftvx\" (UniqueName: \"kubernetes.io/projected/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-kube-api-access-8ftvx\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-combined-ca-bundle\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njp7x\" (UniqueName: \"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.822971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.823000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.823020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.823020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.823681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.824246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.824920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovn-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.824926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-ovs-rundir\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.825351 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-config\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.825472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.825502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.825707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.826040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.829063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.836213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-combined-ca-bundle\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.839109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftvx\" (UniqueName: \"kubernetes.io/projected/cd7c992f-27dd-409a-a7db-34b40e2ed6eb-kube-api-access-8ftvx\") pod \"ovn-controller-metrics-rtctt\" (UID: \"cd7c992f-27dd-409a-a7db-34b40e2ed6eb\") " pod="openstack/ovn-controller-metrics-rtctt"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.842855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njp7x\" (UniqueName: \"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x\") pod \"dnsmasq-dns-8554648995-wcbx6\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " pod="openstack/dnsmasq-dns-8554648995-wcbx6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.844711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdwst\" (UniqueName: \"kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst\") pod \"dnsmasq-dns-5bf47b49b7-sxrj6\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9t2\" (UniqueName: \"kubernetes.io/projected/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-kube-api-access-2n9t2\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-config\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.924990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-scripts\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.925048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0"
Feb 25 07:35:50 crc
kubenswrapper[4749]: I0225 07:35:50.949391 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rtctt" Feb 25 07:35:50 crc kubenswrapper[4749]: I0225 07:35:50.951170 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017434 4749 generic.go:334] "Generic (PLEG): container finished" podID="82616919-9ac9-4161-b12d-a83d80828514" containerID="acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb" exitCode=0 Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" event={"ID":"82616919-9ac9-4161-b12d-a83d80828514","Type":"ContainerDied","Data":"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb"} Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017525 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" event={"ID":"82616919-9ac9-4161-b12d-a83d80828514","Type":"ContainerDied","Data":"c3ced8a39d0aebbe3eb63cac7ed023a0f9fdbf0f8b901b42f47a008d33cbb3b5"} Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017575 4749 scope.go:117] "RemoveContainer" containerID="acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.017510 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.025974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-scripts\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2n9t2\" (UniqueName: \"kubernetes.io/projected/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-kube-api-access-2n9t2\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.026313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-config\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.027137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-config\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.029104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.029248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-scripts\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.031374 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.035078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.043321 4749 scope.go:117] "RemoveContainer" containerID="9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.048307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wcbx6" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.048931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.054205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.054360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9t2\" (UniqueName: \"kubernetes.io/projected/d644537f-b5d7-4f78-be98-d61b2f1d6ac3-kube-api-access-2n9t2\") pod \"ovn-northd-0\" (UID: \"d644537f-b5d7-4f78-be98-d61b2f1d6ac3\") " pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.064110 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.129620 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb\") pod \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.129702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc\") pod \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.129732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg2gr\" (UniqueName: \"kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr\") pod \"82616919-9ac9-4161-b12d-a83d80828514\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.129798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config\") pod \"82616919-9ac9-4161-b12d-a83d80828514\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.129958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config\") pod \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.130018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdwst\" (UniqueName: 
\"kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst\") pod \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\" (UID: \"a59498f2-f095-4d9b-8a5d-05dce88ee32a\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.130111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc\") pod \"82616919-9ac9-4161-b12d-a83d80828514\" (UID: \"82616919-9ac9-4161-b12d-a83d80828514\") " Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.130140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a59498f2-f095-4d9b-8a5d-05dce88ee32a" (UID: "a59498f2-f095-4d9b-8a5d-05dce88ee32a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.130658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a59498f2-f095-4d9b-8a5d-05dce88ee32a" (UID: "a59498f2-f095-4d9b-8a5d-05dce88ee32a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.131087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config" (OuterVolumeSpecName: "config") pod "a59498f2-f095-4d9b-8a5d-05dce88ee32a" (UID: "a59498f2-f095-4d9b-8a5d-05dce88ee32a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.131151 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.131463 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.138752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst" (OuterVolumeSpecName: "kube-api-access-pdwst") pod "a59498f2-f095-4d9b-8a5d-05dce88ee32a" (UID: "a59498f2-f095-4d9b-8a5d-05dce88ee32a"). InnerVolumeSpecName "kube-api-access-pdwst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.140796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr" (OuterVolumeSpecName: "kube-api-access-sg2gr") pod "82616919-9ac9-4161-b12d-a83d80828514" (UID: "82616919-9ac9-4161-b12d-a83d80828514"). InnerVolumeSpecName "kube-api-access-sg2gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.160523 4749 scope.go:117] "RemoveContainer" containerID="acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb" Feb 25 07:35:51 crc kubenswrapper[4749]: E0225 07:35:51.160926 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb\": container with ID starting with acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb not found: ID does not exist" containerID="acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.160956 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb"} err="failed to get container status \"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb\": rpc error: code = NotFound desc = could not find container \"acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb\": container with ID starting with acabf129af6abee95e5c8d4ba856ceb114719a9a1599fca0b440ed231063f1cb not found: ID does not exist" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.160982 4749 scope.go:117] "RemoveContainer" containerID="9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1" Feb 25 07:35:51 crc kubenswrapper[4749]: E0225 07:35:51.161442 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1\": container with ID starting with 9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1 not found: ID does not exist" containerID="9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.161516 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1"} err="failed to get container status \"9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1\": rpc error: code = NotFound desc = could not find container \"9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1\": container with ID starting with 9042f81d6526295d9f2755526d45c91a7bd3697e44dc859c3fa2da4e6a854ab1 not found: ID does not exist" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.176748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config" (OuterVolumeSpecName: "config") pod "82616919-9ac9-4161-b12d-a83d80828514" (UID: "82616919-9ac9-4161-b12d-a83d80828514"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.198589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82616919-9ac9-4161-b12d-a83d80828514" (UID: "82616919-9ac9-4161-b12d-a83d80828514"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.232947 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdwst\" (UniqueName: \"kubernetes.io/projected/a59498f2-f095-4d9b-8a5d-05dce88ee32a-kube-api-access-pdwst\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.233403 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.233422 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg2gr\" (UniqueName: \"kubernetes.io/projected/82616919-9ac9-4161-b12d-a83d80828514-kube-api-access-sg2gr\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.233435 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82616919-9ac9-4161-b12d-a83d80828514-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.233443 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59498f2-f095-4d9b-8a5d-05dce88ee32a-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.362128 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"] Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.368239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-bt5ms"] Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.423344 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rtctt"] Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.546168 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-wcbx6"] Feb 25 07:35:51 crc kubenswrapper[4749]: W0225 07:35:51.553838 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841e48d4_83bd_4321_81bf_a97e51697a0f.slice/crio-f258ccbb12acd4856f4ccd2035d789a9d6f700cf051e4d58054ded05519e1b82 WatchSource:0}: Error finding container f258ccbb12acd4856f4ccd2035d789a9d6f700cf051e4d58054ded05519e1b82: Status 404 returned error can't find the container with id f258ccbb12acd4856f4ccd2035d789a9d6f700cf051e4d58054ded05519e1b82 Feb 25 07:35:51 crc kubenswrapper[4749]: W0225 07:35:51.638226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd644537f_b5d7_4f78_be98_d61b2f1d6ac3.slice/crio-d9981641edb7b6f15c8747e44bf9b7b3b5b8f8a9bdd8cd238da086f829f6519e WatchSource:0}: Error finding container d9981641edb7b6f15c8747e44bf9b7b3b5b8f8a9bdd8cd238da086f829f6519e: Status 404 returned error can't find the container with id d9981641edb7b6f15c8747e44bf9b7b3b5b8f8a9bdd8cd238da086f829f6519e Feb 25 07:35:51 crc kubenswrapper[4749]: I0225 07:35:51.638717 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.031061 4749 generic.go:334] "Generic (PLEG): container finished" podID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerID="bbc1e21e2179a3fa98dbcc47a9d0d51802368d9522d64017f8b72596383e4500" exitCode=0 Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.031388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wcbx6" event={"ID":"841e48d4-83bd-4321-81bf-a97e51697a0f","Type":"ContainerDied","Data":"bbc1e21e2179a3fa98dbcc47a9d0d51802368d9522d64017f8b72596383e4500"} Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.031899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-wcbx6" event={"ID":"841e48d4-83bd-4321-81bf-a97e51697a0f","Type":"ContainerStarted","Data":"f258ccbb12acd4856f4ccd2035d789a9d6f700cf051e4d58054ded05519e1b82"} Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.036389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d644537f-b5d7-4f78-be98-d61b2f1d6ac3","Type":"ContainerStarted","Data":"d9981641edb7b6f15c8747e44bf9b7b3b5b8f8a9bdd8cd238da086f829f6519e"} Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.038439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rtctt" event={"ID":"cd7c992f-27dd-409a-a7db-34b40e2ed6eb","Type":"ContainerStarted","Data":"4b1de262386be183b313c6c6321afcc947e0edd1e3d89d0a499acfb2af21ebd2"} Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.038481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rtctt" event={"ID":"cd7c992f-27dd-409a-a7db-34b40e2ed6eb","Type":"ContainerStarted","Data":"5a7a748bcbbbbc3ceed8f32f9bb2e69e7277c1999349e9060d791dd440661d0c"} Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.039695 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sxrj6" Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.168555 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sxrj6"] Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.201216 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rtctt" podStartSLOduration=2.201195372 podStartE2EDuration="2.201195372s" podCreationTimestamp="2026-02-25 07:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:35:52.113817524 +0000 UTC m=+1105.475643564" watchObservedRunningTime="2026-02-25 07:35:52.201195372 +0000 UTC m=+1105.563021402" Feb 25 07:35:52 crc kubenswrapper[4749]: I0225 07:35:52.206571 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sxrj6"] Feb 25 07:35:52 crc kubenswrapper[4749]: E0225 07:35:52.321608 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:58794->38.102.83.151:43635: write tcp 38.102.83.151:58794->38.102.83.151:43635: write: broken pipe Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.069233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wcbx6" event={"ID":"841e48d4-83bd-4321-81bf-a97e51697a0f","Type":"ContainerStarted","Data":"5dc21ca65a23c4b4f9ebbca185bbfbcc4fe36e5e88a1158fa1026d7ca40fbaff"} Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.069563 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-wcbx6" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.075189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d644537f-b5d7-4f78-be98-d61b2f1d6ac3","Type":"ContainerStarted","Data":"a2b454f4a63fba3c9093e7d9093011914e629278f0fadd191fe3b480f05ca015"} Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.075246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d644537f-b5d7-4f78-be98-d61b2f1d6ac3","Type":"ContainerStarted","Data":"cdac9874162ecdf5eff71131f83f2e122477fc66c3319c44ecf3a0b5c7aac45c"} Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.075436 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.094885 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-wcbx6" podStartSLOduration=3.094857589 podStartE2EDuration="3.094857589s" podCreationTimestamp="2026-02-25 07:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:35:53.092990533 +0000 UTC m=+1106.454816543" watchObservedRunningTime="2026-02-25 07:35:53.094857589 +0000 UTC m=+1106.456683649" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.121825 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.090452841 podStartE2EDuration="3.121805631s" podCreationTimestamp="2026-02-25 07:35:50 +0000 UTC" firstStartedPulling="2026-02-25 07:35:51.647769126 +0000 UTC m=+1105.009595146" lastFinishedPulling="2026-02-25 07:35:52.679121896 +0000 UTC m=+1106.040947936" observedRunningTime="2026-02-25 07:35:53.109991907 +0000 UTC m=+1106.471817957" watchObservedRunningTime="2026-02-25 07:35:53.121805631 +0000 UTC m=+1106.483631651" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.208569 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 25 07:35:53 crc kubenswrapper[4749]: 
I0225 07:35:53.208848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.334892 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82616919-9ac9-4161-b12d-a83d80828514" path="/var/lib/kubelet/pods/82616919-9ac9-4161-b12d-a83d80828514/volumes" Feb 25 07:35:53 crc kubenswrapper[4749]: I0225 07:35:53.336358 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59498f2-f095-4d9b-8a5d-05dce88ee32a" path="/var/lib/kubelet/pods/a59498f2-f095-4d9b-8a5d-05dce88ee32a/volumes" Feb 25 07:35:53 crc kubenswrapper[4749]: E0225 07:35:53.697010 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:58822->38.102.83.151:43635: write tcp 38.102.83.151:58822->38.102.83.151:43635: write: broken pipe Feb 25 07:35:54 crc kubenswrapper[4749]: I0225 07:35:54.147760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 25 07:35:54 crc kubenswrapper[4749]: I0225 07:35:54.730066 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:54 crc kubenswrapper[4749]: I0225 07:35:54.730564 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.198002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.784091 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-bt5ms" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.101:5353: i/o timeout" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.972355 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-6259-account-create-update-5zpf8"] Feb 25 07:35:55 crc kubenswrapper[4749]: E0225 07:35:55.972837 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="init" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.972866 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="init" Feb 25 07:35:55 crc kubenswrapper[4749]: E0225 07:35:55.972920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="dnsmasq-dns" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.972933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="dnsmasq-dns" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.973196 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="82616919-9ac9-4161-b12d-a83d80828514" containerName="dnsmasq-dns" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.973998 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.977766 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 25 07:35:55 crc kubenswrapper[4749]: I0225 07:35:55.995146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6259-account-create-update-5zpf8"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.023525 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mqghp"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.025625 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.027883 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mqghp"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.052262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5xp\" (UniqueName: \"kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.052374 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.153232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5xp\" (UniqueName: \"kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.153297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 
07:35:56.153363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk528\" (UniqueName: \"kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.153389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.154467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.171061 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5fnnk"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.172097 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.184050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5fnnk"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.186362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5xp\" (UniqueName: \"kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp\") pod \"keystone-6259-account-create-update-5zpf8\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.254946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.255231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.255300 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkc5\" (UniqueName: \"kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.255323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gk528\" (UniqueName: \"kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.256339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.278508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk528\" (UniqueName: \"kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528\") pod \"keystone-db-create-mqghp\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.286964 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-39fa-account-create-update-2sll4"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.287978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.290125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.295963 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-39fa-account-create-update-2sll4"] Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.311005 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.339033 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.357511 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8zv\" (UniqueName: \"kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.357574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.357630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.357780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lkc5\" (UniqueName: \"kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.358253 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.381514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lkc5\" (UniqueName: \"kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5\") pod \"placement-db-create-5fnnk\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.460219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8zv\" (UniqueName: \"kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.460620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.461283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc 
kubenswrapper[4749]: I0225 07:35:56.486391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8zv\" (UniqueName: \"kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv\") pod \"placement-39fa-account-create-update-2sll4\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.555158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.616250 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.819503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6259-account-create-update-5zpf8"] Feb 25 07:35:56 crc kubenswrapper[4749]: W0225 07:35:56.825188 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625ec738_a35a_4a37_ab15_63334e614c88.slice/crio-a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe WatchSource:0}: Error finding container a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe: Status 404 returned error can't find the container with id a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe Feb 25 07:35:56 crc kubenswrapper[4749]: I0225 07:35:56.910530 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mqghp"] Feb 25 07:35:56 crc kubenswrapper[4749]: W0225 07:35:56.924787 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod122084b5_9e7b_4874_86db_10ae68b0c801.slice/crio-fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e WatchSource:0}: Error 
finding container fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e: Status 404 returned error can't find the container with id fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.022443 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5fnnk"] Feb 25 07:35:57 crc kubenswrapper[4749]: W0225 07:35:57.040827 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29ec8fbf_7999_4ba2_a4a0_cf742f7317dc.slice/crio-accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1 WatchSource:0}: Error finding container accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1: Status 404 returned error can't find the container with id accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1 Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.105521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6259-account-create-update-5zpf8" event={"ID":"625ec738-a35a-4a37-ab15-63334e614c88","Type":"ContainerStarted","Data":"81cef26eca6a22bcd0bfee9374f35ac974f3ff6768a118d779ddfb7a075e8875"} Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.105565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6259-account-create-update-5zpf8" event={"ID":"625ec738-a35a-4a37-ab15-63334e614c88","Type":"ContainerStarted","Data":"a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe"} Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.110413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqghp" event={"ID":"122084b5-9e7b-4874-86db-10ae68b0c801","Type":"ContainerStarted","Data":"fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e"} Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.114078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-5fnnk" event={"ID":"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc","Type":"ContainerStarted","Data":"accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1"} Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.146304 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6259-account-create-update-5zpf8" podStartSLOduration=2.146288703 podStartE2EDuration="2.146288703s" podCreationTimestamp="2026-02-25 07:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:35:57.135225048 +0000 UTC m=+1110.497051078" watchObservedRunningTime="2026-02-25 07:35:57.146288703 +0000 UTC m=+1110.508114723" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.158157 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.159625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-39fa-account-create-update-2sll4"] Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.242266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wcbx6"] Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.242740 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-wcbx6" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="dnsmasq-dns" containerID="cri-o://5dc21ca65a23c4b4f9ebbca185bbfbcc4fe36e5e88a1158fa1026d7ca40fbaff" gracePeriod=10 Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.251847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-wcbx6" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.288572 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 
25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.289830 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.301797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.382171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.382228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.382271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.382295 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgmgw\" (UniqueName: \"kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 
07:35:57.382327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.472835 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.485452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.485511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgmgw\" (UniqueName: \"kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.485574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.486478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: 
\"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.487112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.491953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.492033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.492879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.493179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: 
I0225 07:35:57.528035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgmgw\" (UniqueName: \"kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw\") pod \"dnsmasq-dns-b8fbc5445-nzpgj\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.609673 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:35:57 crc kubenswrapper[4749]: I0225 07:35:57.637468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.068992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 25 07:35:58 crc kubenswrapper[4749]: W0225 07:35:58.071941 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f2f496a_5971_447e_887f_9d3c5e70f9ea.slice/crio-1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa WatchSource:0}: Error finding container 1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa: Status 404 returned error can't find the container with id 1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.121616 4749 generic.go:334] "Generic (PLEG): container finished" podID="29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" containerID="a3521c34d5ad33c1c4c8a5d9b95b563c4893620005cc23ac2b747d35d875aa71" exitCode=0 Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.121690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5fnnk" event={"ID":"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc","Type":"ContainerDied","Data":"a3521c34d5ad33c1c4c8a5d9b95b563c4893620005cc23ac2b747d35d875aa71"} Feb 25 07:35:58 
crc kubenswrapper[4749]: I0225 07:35:58.123561 4749 generic.go:334] "Generic (PLEG): container finished" podID="625ec738-a35a-4a37-ab15-63334e614c88" containerID="81cef26eca6a22bcd0bfee9374f35ac974f3ff6768a118d779ddfb7a075e8875" exitCode=0 Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.123692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6259-account-create-update-5zpf8" event={"ID":"625ec738-a35a-4a37-ab15-63334e614c88","Type":"ContainerDied","Data":"81cef26eca6a22bcd0bfee9374f35ac974f3ff6768a118d779ddfb7a075e8875"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.126729 4749 generic.go:334] "Generic (PLEG): container finished" podID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerID="5dc21ca65a23c4b4f9ebbca185bbfbcc4fe36e5e88a1158fa1026d7ca40fbaff" exitCode=0 Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.126765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-wcbx6" event={"ID":"841e48d4-83bd-4321-81bf-a97e51697a0f","Type":"ContainerDied","Data":"5dc21ca65a23c4b4f9ebbca185bbfbcc4fe36e5e88a1158fa1026d7ca40fbaff"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.128128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" event={"ID":"2f2f496a-5971-447e-887f-9d3c5e70f9ea","Type":"ContainerStarted","Data":"1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.133270 4749 generic.go:334] "Generic (PLEG): container finished" podID="122084b5-9e7b-4874-86db-10ae68b0c801" containerID="877034bdebe9af498641ea1384a3d14c6780bbeba64f9119e1323a225b912fdd" exitCode=0 Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.133307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqghp" 
event={"ID":"122084b5-9e7b-4874-86db-10ae68b0c801","Type":"ContainerDied","Data":"877034bdebe9af498641ea1384a3d14c6780bbeba64f9119e1323a225b912fdd"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.135030 4749 generic.go:334] "Generic (PLEG): container finished" podID="3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" containerID="a9ad0337906e9bd2757ef085f0fe39f17c49743a19bd35f1e6f151c465337a64" exitCode=0 Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.135261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-39fa-account-create-update-2sll4" event={"ID":"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d","Type":"ContainerDied","Data":"a9ad0337906e9bd2757ef085f0fe39f17c49743a19bd35f1e6f151c465337a64"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.135282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-39fa-account-create-update-2sll4" event={"ID":"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d","Type":"ContainerStarted","Data":"d7731e19961f4263f3754f78e1ec10d167c8ddd5c6978f2327a6910cd861908d"} Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.142788 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wcbx6" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.306618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njp7x\" (UniqueName: \"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x\") pod \"841e48d4-83bd-4321-81bf-a97e51697a0f\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.306752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb\") pod \"841e48d4-83bd-4321-81bf-a97e51697a0f\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.306825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config\") pod \"841e48d4-83bd-4321-81bf-a97e51697a0f\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.306927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb\") pod \"841e48d4-83bd-4321-81bf-a97e51697a0f\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.306970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc\") pod \"841e48d4-83bd-4321-81bf-a97e51697a0f\" (UID: \"841e48d4-83bd-4321-81bf-a97e51697a0f\") " Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.309965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x" (OuterVolumeSpecName: "kube-api-access-njp7x") pod "841e48d4-83bd-4321-81bf-a97e51697a0f" (UID: "841e48d4-83bd-4321-81bf-a97e51697a0f"). InnerVolumeSpecName "kube-api-access-njp7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.340498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "841e48d4-83bd-4321-81bf-a97e51697a0f" (UID: "841e48d4-83bd-4321-81bf-a97e51697a0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.345129 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config" (OuterVolumeSpecName: "config") pod "841e48d4-83bd-4321-81bf-a97e51697a0f" (UID: "841e48d4-83bd-4321-81bf-a97e51697a0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.348733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "841e48d4-83bd-4321-81bf-a97e51697a0f" (UID: "841e48d4-83bd-4321-81bf-a97e51697a0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.352182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "841e48d4-83bd-4321-81bf-a97e51697a0f" (UID: "841e48d4-83bd-4321-81bf-a97e51697a0f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.409009 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.409211 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.409369 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njp7x\" (UniqueName: \"kubernetes.io/projected/841e48d4-83bd-4321-81bf-a97e51697a0f-kube-api-access-njp7x\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.409508 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.409665 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841e48d4-83bd-4321-81bf-a97e51697a0f-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.418515 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 25 07:35:58 crc kubenswrapper[4749]: E0225 07:35:58.419912 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="init" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.419926 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="init" Feb 25 07:35:58 crc kubenswrapper[4749]: E0225 07:35:58.419939 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="dnsmasq-dns" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.419945 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="dnsmasq-dns" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.420155 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" containerName="dnsmasq-dns" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.424678 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.433978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.434173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.434340 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2kfsz" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.434515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.514870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhjh\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-kube-api-access-kwhjh\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.514923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-lock\") pod 
\"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.514968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.515025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4664b9ba-7c02-431a-89c5-715d216ee127-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.515072 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-cache\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.515111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.517057 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhjh\" (UniqueName: 
\"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-kube-api-access-kwhjh\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-lock\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4664b9ba-7c02-431a-89c5-715d216ee127-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-cache\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: E0225 
07:35:58.616305 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 07:35:58 crc kubenswrapper[4749]: E0225 07:35:58.616317 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 07:35:58 crc kubenswrapper[4749]: E0225 07:35:58.616359 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift podName:4664b9ba-7c02-431a-89c5-715d216ee127 nodeName:}" failed. No retries permitted until 2026-02-25 07:35:59.116342759 +0000 UTC m=+1112.478168779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift") pod "swift-storage-0" (UID: "4664b9ba-7c02-431a-89c5-715d216ee127") : configmap "swift-ring-files" not found Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-lock\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.616903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4664b9ba-7c02-431a-89c5-715d216ee127-cache\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.617028 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.621101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4664b9ba-7c02-431a-89c5-715d216ee127-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.638677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhjh\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-kube-api-access-kwhjh\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.641466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.986406 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tw69n"] Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.988013 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.991165 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.992529 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 25 07:35:58 crc kubenswrapper[4749]: I0225 07:35:58.992921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.006408 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tw69n"] Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.123892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.123990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: E0225 07:35:59.124110 4749 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 07:35:59 crc kubenswrapper[4749]: E0225 07:35:59.124137 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: E0225 07:35:59.124191 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift podName:4664b9ba-7c02-431a-89c5-715d216ee127 nodeName:}" failed. No retries permitted until 2026-02-25 07:36:00.124171769 +0000 UTC m=+1113.485997789 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift") pod "swift-storage-0" (UID: "4664b9ba-7c02-431a-89c5-715d216ee127") : configmap "swift-ring-files" not found Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.124571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4sxd\" (UniqueName: \"kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.143682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-wcbx6" event={"ID":"841e48d4-83bd-4321-81bf-a97e51697a0f","Type":"ContainerDied","Data":"f258ccbb12acd4856f4ccd2035d789a9d6f700cf051e4d58054ded05519e1b82"} Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.143736 4749 scope.go:117] "RemoveContainer" containerID="5dc21ca65a23c4b4f9ebbca185bbfbcc4fe36e5e88a1158fa1026d7ca40fbaff" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.143829 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-wcbx6" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.145809 4749 generic.go:334] "Generic (PLEG): container finished" podID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerID="14150f4048543b1bea80e7f17add36062f65116b769b582813183a83c08b3432" exitCode=0 Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.145902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" event={"ID":"2f2f496a-5971-447e-887f-9d3c5e70f9ea","Type":"ContainerDied","Data":"14150f4048543b1bea80e7f17add36062f65116b769b582813183a83c08b3432"} Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.187768 4749 scope.go:117] "RemoveContainer" containerID="bbc1e21e2179a3fa98dbcc47a9d0d51802368d9522d64017f8b72596383e4500" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.227501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4sxd\" (UniqueName: \"kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.227903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf\") pod \"swift-ring-rebalance-tw69n\" 
(UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.227984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.228023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.228077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.228113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.228140 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 
07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.228488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.229249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.229808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.230973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.233067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.233077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.249269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4sxd\" (UniqueName: \"kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd\") pod \"swift-ring-rebalance-tw69n\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.309511 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.314365 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wcbx6"] Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.335612 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-wcbx6"] Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.486369 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5fnnk" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.532198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts\") pod \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.532406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lkc5\" (UniqueName: \"kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5\") pod \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\" (UID: \"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.533142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" (UID: "29ec8fbf-7999-4ba2-a4a0-cf742f7317dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.545264 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5" (OuterVolumeSpecName: "kube-api-access-4lkc5") pod "29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" (UID: "29ec8fbf-7999-4ba2-a4a0-cf742f7317dc"). InnerVolumeSpecName "kube-api-access-4lkc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.634247 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.634284 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lkc5\" (UniqueName: \"kubernetes.io/projected/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc-kube-api-access-4lkc5\") on node \"crc\" DevicePath \"\"" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.711108 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6259-account-create-update-5zpf8" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.712494 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqghp" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.717034 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-39fa-account-create-update-2sll4" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.836928 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts\") pod \"625ec738-a35a-4a37-ab15-63334e614c88\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.836993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts\") pod \"122084b5-9e7b-4874-86db-10ae68b0c801\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.837126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts\") pod \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.837184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt5xp\" (UniqueName: \"kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp\") pod \"625ec738-a35a-4a37-ab15-63334e614c88\" (UID: \"625ec738-a35a-4a37-ab15-63334e614c88\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.837237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk528\" (UniqueName: \"kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528\") pod \"122084b5-9e7b-4874-86db-10ae68b0c801\" (UID: \"122084b5-9e7b-4874-86db-10ae68b0c801\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.837332 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4b8zv\" (UniqueName: \"kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv\") pod \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\" (UID: \"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d\") " Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.837842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "625ec738-a35a-4a37-ab15-63334e614c88" (UID: "625ec738-a35a-4a37-ab15-63334e614c88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.838394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" (UID: "3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.838800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "122084b5-9e7b-4874-86db-10ae68b0c801" (UID: "122084b5-9e7b-4874-86db-10ae68b0c801"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.840782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp" (OuterVolumeSpecName: "kube-api-access-gt5xp") pod "625ec738-a35a-4a37-ab15-63334e614c88" (UID: "625ec738-a35a-4a37-ab15-63334e614c88"). 
InnerVolumeSpecName "kube-api-access-gt5xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.841012 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv" (OuterVolumeSpecName: "kube-api-access-4b8zv") pod "3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" (UID: "3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d"). InnerVolumeSpecName "kube-api-access-4b8zv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.843391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528" (OuterVolumeSpecName: "kube-api-access-gk528") pod "122084b5-9e7b-4874-86db-10ae68b0c801" (UID: "122084b5-9e7b-4874-86db-10ae68b0c801"). InnerVolumeSpecName "kube-api-access-gk528". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.884007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tw69n"]
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940215 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940666 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt5xp\" (UniqueName: \"kubernetes.io/projected/625ec738-a35a-4a37-ab15-63334e614c88-kube-api-access-gt5xp\") on node \"crc\" DevicePath \"\""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940686 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk528\" (UniqueName: \"kubernetes.io/projected/122084b5-9e7b-4874-86db-10ae68b0c801-kube-api-access-gk528\") on node \"crc\" DevicePath \"\""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b8zv\" (UniqueName: \"kubernetes.io/projected/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d-kube-api-access-4b8zv\") on node \"crc\" DevicePath \"\""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940723 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/625ec738-a35a-4a37-ab15-63334e614c88-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:35:59 crc kubenswrapper[4749]: I0225 07:35:59.940739 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122084b5-9e7b-4874-86db-10ae68b0c801-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.132143 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533416-7mtkv"]
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.132717 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.132747 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.132782 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122084b5-9e7b-4874-86db-10ae68b0c801" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.132793 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="122084b5-9e7b-4874-86db-10ae68b0c801" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.132816 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="625ec738-a35a-4a37-ab15-63334e614c88" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.132828 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="625ec738-a35a-4a37-ab15-63334e614c88" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.132879 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.132892 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.133129 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="122084b5-9e7b-4874-86db-10ae68b0c801" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.133158 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" containerName="mariadb-database-create"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.133178 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="625ec738-a35a-4a37-ab15-63334e614c88" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.133196 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" containerName="mariadb-account-create-update"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.133992 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533416-7mtkv"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.136724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.137376 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.140882 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.143615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0"
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.143792 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.143806 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 25 07:36:00 crc kubenswrapper[4749]: E0225 07:36:00.143847 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift podName:4664b9ba-7c02-431a-89c5-715d216ee127 nodeName:}" failed. No retries permitted until 2026-02-25 07:36:02.143832927 +0000 UTC m=+1115.505658947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift") pod "swift-storage-0" (UID: "4664b9ba-7c02-431a-89c5-715d216ee127") : configmap "swift-ring-files" not found
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.151482 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533416-7mtkv"]
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.159792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tw69n" event={"ID":"9d9bb0e5-da1f-480e-8a59-e00767290acc","Type":"ContainerStarted","Data":"3cc9487d5df801cc34ade6877bf07f23835c7e73ee67366ae95914a0a3ff5a80"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.161200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5fnnk" event={"ID":"29ec8fbf-7999-4ba2-a4a0-cf742f7317dc","Type":"ContainerDied","Data":"accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.161251 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="accf5c9266e48ae4d26f8098c9ebf4f8bac09d7ea6c5934cfcc2f59e3e0a50a1"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.161206 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5fnnk"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.162447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6259-account-create-update-5zpf8" event={"ID":"625ec738-a35a-4a37-ab15-63334e614c88","Type":"ContainerDied","Data":"a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.162488 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74d86ce092e01f01f62b9e8d96bc4a6d19ae4edd4d1c0f343b2a8888a86acbe"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.162569 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6259-account-create-update-5zpf8"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.168198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" event={"ID":"2f2f496a-5971-447e-887f-9d3c5e70f9ea","Type":"ContainerStarted","Data":"233fe70f6ad5d3572d4afc1feaeef24255f05ce045d2cbabbf5c417bd040eddd"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.168721 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.169979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mqghp" event={"ID":"122084b5-9e7b-4874-86db-10ae68b0c801","Type":"ContainerDied","Data":"fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.170022 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdec01f8918def9a4c4f82fa51f2cee01427d2d51a3d2a6b6dcc092b2ad45b2e"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.170098 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mqghp"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.174036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-39fa-account-create-update-2sll4" event={"ID":"3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d","Type":"ContainerDied","Data":"d7731e19961f4263f3754f78e1ec10d167c8ddd5c6978f2327a6910cd861908d"}
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.174076 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7731e19961f4263f3754f78e1ec10d167c8ddd5c6978f2327a6910cd861908d"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.174129 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-39fa-account-create-update-2sll4"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.198157 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" podStartSLOduration=3.198137721 podStartE2EDuration="3.198137721s" podCreationTimestamp="2026-02-25 07:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:00.187675681 +0000 UTC m=+1113.549501701" watchObservedRunningTime="2026-02-25 07:36:00.198137721 +0000 UTC m=+1113.559963751"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.261973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dzd\" (UniqueName: \"kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd\") pod \"auto-csr-approver-29533416-7mtkv\" (UID: \"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f\") " pod="openshift-infra/auto-csr-approver-29533416-7mtkv"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.272958 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-48962"]
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.277508 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.282452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-48962"]
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.365606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.366440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42dzd\" (UniqueName: \"kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd\") pod \"auto-csr-approver-29533416-7mtkv\" (UID: \"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f\") " pod="openshift-infra/auto-csr-approver-29533416-7mtkv"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.366543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49z78\" (UniqueName: \"kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.380404 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ca9f-account-create-update-5kb9h"]
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.381476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.383444 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.418830 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dzd\" (UniqueName: \"kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd\") pod \"auto-csr-approver-29533416-7mtkv\" (UID: \"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f\") " pod="openshift-infra/auto-csr-approver-29533416-7mtkv"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.420924 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ca9f-account-create-update-5kb9h"]
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.460399 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533416-7mtkv"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.468679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28cqq\" (UniqueName: \"kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.469355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.469487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.469532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49z78\" (UniqueName: \"kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.470723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.501072 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49z78\" (UniqueName: \"kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78\") pod \"glance-db-create-48962\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") " pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.570852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28cqq\" (UniqueName: \"kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.571384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.572340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.590861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-48962"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.593008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28cqq\" (UniqueName: \"kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq\") pod \"glance-ca9f-account-create-update-5kb9h\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") " pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.696851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.719019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533416-7mtkv"]
Feb 25 07:36:00 crc kubenswrapper[4749]: W0225 07:36:00.728835 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b16ca14_bc0d_48c3_aa38_0af73eb7ca6f.slice/crio-a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d WatchSource:0}: Error finding container a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d: Status 404 returned error can't find the container with id a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d
Feb 25 07:36:00 crc kubenswrapper[4749]: I0225 07:36:00.860302 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-48962"]
Feb 25 07:36:00 crc kubenswrapper[4749]: W0225 07:36:00.867272 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde31f747_1a7d_4950_93b5_88938e84e33e.slice/crio-f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e WatchSource:0}: Error finding container f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e: Status 404 returned error can't find the container with id f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.151282 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ca9f-account-create-update-5kb9h"]
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.184701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" event={"ID":"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f","Type":"ContainerStarted","Data":"a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d"}
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.188300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca9f-account-create-update-5kb9h" event={"ID":"cfb70cdb-8655-4c36-8c0c-8bde033488ad","Type":"ContainerStarted","Data":"3f7df79570dc6995a6ce82e864dfdb38e2365380e499f3af13a7571535d4cfe8"}
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.192565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-48962" event={"ID":"de31f747-1a7d-4950-93b5-88938e84e33e","Type":"ContainerStarted","Data":"ec22a47aecb1976c825d82270f902e7da78446566dcbe2062f91a67cb37acfee"}
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.192655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-48962" event={"ID":"de31f747-1a7d-4950-93b5-88938e84e33e","Type":"ContainerStarted","Data":"f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e"}
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.219205 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-48962" podStartSLOduration=1.219174814 podStartE2EDuration="1.219174814s" podCreationTimestamp="2026-02-25 07:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:01.204444717 +0000 UTC m=+1114.566270737" watchObservedRunningTime="2026-02-25 07:36:01.219174814 +0000 UTC m=+1114.581000844"
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.333319 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841e48d4-83bd-4321-81bf-a97e51697a0f" path="/var/lib/kubelet/pods/841e48d4-83bd-4321-81bf-a97e51697a0f/volumes"
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.846810 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-r2k76"]
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.848900 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.852889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r2k76"]
Feb 25 07:36:01 crc kubenswrapper[4749]: I0225 07:36:01.853615 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.002134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.002665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.104257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.104401 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.105411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.130312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r\") pod \"root-account-create-update-r2k76\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.206722 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r2k76"
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.207227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0"
Feb 25 07:36:02 crc kubenswrapper[4749]: E0225 07:36:02.207541 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 25 07:36:02 crc kubenswrapper[4749]: E0225 07:36:02.207585 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 25 07:36:02 crc kubenswrapper[4749]: E0225 07:36:02.207695 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift podName:4664b9ba-7c02-431a-89c5-715d216ee127 nodeName:}" failed. No retries permitted until 2026-02-25 07:36:06.207664565 +0000 UTC m=+1119.569490625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift") pod "swift-storage-0" (UID: "4664b9ba-7c02-431a-89c5-715d216ee127") : configmap "swift-ring-files" not found
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.209888 4749 generic.go:334] "Generic (PLEG): container finished" podID="de31f747-1a7d-4950-93b5-88938e84e33e" containerID="ec22a47aecb1976c825d82270f902e7da78446566dcbe2062f91a67cb37acfee" exitCode=0
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.209995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-48962" event={"ID":"de31f747-1a7d-4950-93b5-88938e84e33e","Type":"ContainerDied","Data":"ec22a47aecb1976c825d82270f902e7da78446566dcbe2062f91a67cb37acfee"}
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.214109 4749 generic.go:334] "Generic (PLEG): container finished" podID="cfb70cdb-8655-4c36-8c0c-8bde033488ad" containerID="43b66c1f3b7fd34bedaca82526c00987fa78506682b2ef846fb59b6fd6afd428" exitCode=0
Feb 25 07:36:02 crc kubenswrapper[4749]: I0225 07:36:02.214145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca9f-account-create-update-5kb9h" event={"ID":"cfb70cdb-8655-4c36-8c0c-8bde033488ad","Type":"ContainerDied","Data":"43b66c1f3b7fd34bedaca82526c00987fa78506682b2ef846fb59b6fd6afd428"}
Feb 25 07:36:03 crc kubenswrapper[4749]: I0225 07:36:03.915926 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:03 crc kubenswrapper[4749]: I0225 07:36:03.925700 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-48962"
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.044043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28cqq\" (UniqueName: \"kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq\") pod \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") "
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.044105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts\") pod \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\" (UID: \"cfb70cdb-8655-4c36-8c0c-8bde033488ad\") "
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.044267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts\") pod \"de31f747-1a7d-4950-93b5-88938e84e33e\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") "
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.044330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49z78\" (UniqueName: \"kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78\") pod \"de31f747-1a7d-4950-93b5-88938e84e33e\" (UID: \"de31f747-1a7d-4950-93b5-88938e84e33e\") "
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.046404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de31f747-1a7d-4950-93b5-88938e84e33e" (UID: "de31f747-1a7d-4950-93b5-88938e84e33e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.046467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfb70cdb-8655-4c36-8c0c-8bde033488ad" (UID: "cfb70cdb-8655-4c36-8c0c-8bde033488ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.065222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78" (OuterVolumeSpecName: "kube-api-access-49z78") pod "de31f747-1a7d-4950-93b5-88938e84e33e" (UID: "de31f747-1a7d-4950-93b5-88938e84e33e"). InnerVolumeSpecName "kube-api-access-49z78". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.072365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq" (OuterVolumeSpecName: "kube-api-access-28cqq") pod "cfb70cdb-8655-4c36-8c0c-8bde033488ad" (UID: "cfb70cdb-8655-4c36-8c0c-8bde033488ad"). InnerVolumeSpecName "kube-api-access-28cqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.146552 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de31f747-1a7d-4950-93b5-88938e84e33e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.146659 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49z78\" (UniqueName: \"kubernetes.io/projected/de31f747-1a7d-4950-93b5-88938e84e33e-kube-api-access-49z78\") on node \"crc\" DevicePath \"\""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.146684 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28cqq\" (UniqueName: \"kubernetes.io/projected/cfb70cdb-8655-4c36-8c0c-8bde033488ad-kube-api-access-28cqq\") on node \"crc\" DevicePath \"\""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.146704 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfb70cdb-8655-4c36-8c0c-8bde033488ad-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.234350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tw69n" event={"ID":"9d9bb0e5-da1f-480e-8a59-e00767290acc","Type":"ContainerStarted","Data":"956f40ac6e605b683067944921d5c6faa686acdcaf45a61325dd3833b1e9df4d"}
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.235889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" event={"ID":"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f","Type":"ContainerStarted","Data":"4bbc63636967ccf784c0e71258b47391e9620da4850f647fcdd366de88679cd1"}
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.238367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca9f-account-create-update-5kb9h" event={"ID":"cfb70cdb-8655-4c36-8c0c-8bde033488ad","Type":"ContainerDied","Data":"3f7df79570dc6995a6ce82e864dfdb38e2365380e499f3af13a7571535d4cfe8"}
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.238413 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7df79570dc6995a6ce82e864dfdb38e2365380e499f3af13a7571535d4cfe8"
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.238505 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca9f-account-create-update-5kb9h"
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.248385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-48962" event={"ID":"de31f747-1a7d-4950-93b5-88938e84e33e","Type":"ContainerDied","Data":"f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e"}
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.248441 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44ed79add4fd0af661889f97c1ab705bc63c1673dab93209ae85bdb7819b44e"
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.248523 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-48962"
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.267985 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tw69n" podStartSLOduration=2.372016686 podStartE2EDuration="6.267964135s" podCreationTimestamp="2026-02-25 07:35:58 +0000 UTC" firstStartedPulling="2026-02-25 07:35:59.889368904 +0000 UTC m=+1113.251194924" lastFinishedPulling="2026-02-25 07:36:03.785316363 +0000 UTC m=+1117.147142373" observedRunningTime="2026-02-25 07:36:04.253225657 +0000 UTC m=+1117.615051687" watchObservedRunningTime="2026-02-25 07:36:04.267964135 +0000 UTC m=+1117.629790175"
Feb 25 07:36:04 crc kubenswrapper[4749]: W0225 07:36:04.273712 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb7dbf6_31d2_47a5_ae8c_cce3ae307520.slice/crio-29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33 WatchSource:0}: Error finding container 29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33: Status 404 returned error can't find the container with id 29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.279091 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r2k76"]
Feb 25 07:36:04 crc kubenswrapper[4749]: I0225 07:36:04.283282 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" podStartSLOduration=1.198103868 podStartE2EDuration="4.283260376s" podCreationTimestamp="2026-02-25 07:36:00 +0000 UTC" firstStartedPulling="2026-02-25 07:36:00.731546168 +0000 UTC m=+1114.093372188" lastFinishedPulling="2026-02-25 07:36:03.816702666 +0000 UTC m=+1117.178528696" observedRunningTime="2026-02-25 07:36:04.282405944 +0000 UTC m=+1117.644231984" watchObservedRunningTime="2026-02-25
07:36:04.283260376 +0000 UTC m=+1117.645086436" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.278779 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" containerID="4bbc63636967ccf784c0e71258b47391e9620da4850f647fcdd366de88679cd1" exitCode=0 Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.278823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" event={"ID":"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f","Type":"ContainerDied","Data":"4bbc63636967ccf784c0e71258b47391e9620da4850f647fcdd366de88679cd1"} Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.292826 4749 generic.go:334] "Generic (PLEG): container finished" podID="ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" containerID="ddb7923ed52544226de56285cb4e6acb84e010d2f881d8f2d34079d76daf5302" exitCode=0 Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.292898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r2k76" event={"ID":"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520","Type":"ContainerDied","Data":"ddb7923ed52544226de56285cb4e6acb84e010d2f881d8f2d34079d76daf5302"} Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.292954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r2k76" event={"ID":"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520","Type":"ContainerStarted","Data":"29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33"} Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.522248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6lvxb"] Feb 25 07:36:05 crc kubenswrapper[4749]: E0225 07:36:05.534211 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb70cdb-8655-4c36-8c0c-8bde033488ad" containerName="mariadb-account-create-update" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.534261 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cfb70cdb-8655-4c36-8c0c-8bde033488ad" containerName="mariadb-account-create-update" Feb 25 07:36:05 crc kubenswrapper[4749]: E0225 07:36:05.534298 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de31f747-1a7d-4950-93b5-88938e84e33e" containerName="mariadb-database-create" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.534311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="de31f747-1a7d-4950-93b5-88938e84e33e" containerName="mariadb-database-create" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.534585 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb70cdb-8655-4c36-8c0c-8bde033488ad" containerName="mariadb-account-create-update" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.534650 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="de31f747-1a7d-4950-93b5-88938e84e33e" containerName="mariadb-database-create" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.535543 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.535757 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6lvxb"] Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.537844 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wjttw" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.541108 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.694537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.694790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.694855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.694886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25vw\" (UniqueName: 
\"kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.797030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.797149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.797172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.797194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25vw\" (UniqueName: \"kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.805653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data\") pod \"glance-db-sync-6lvxb\" 
(UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.807642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.807649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.827061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25vw\" (UniqueName: \"kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw\") pod \"glance-db-sync-6lvxb\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:05 crc kubenswrapper[4749]: I0225 07:36:05.867451 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.230852 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6lvxb"] Feb 25 07:36:06 crc kubenswrapper[4749]: W0225 07:36:06.236974 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0991404c_02d3_451e_9a9a_fbd93370e965.slice/crio-a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8 WatchSource:0}: Error finding container a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8: Status 404 returned error can't find the container with id a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8 Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.304541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.304926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6lvxb" event={"ID":"0991404c-02d3-451e-9a9a-fbd93370e965","Type":"ContainerStarted","Data":"a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8"} Feb 25 07:36:06 crc kubenswrapper[4749]: E0225 07:36:06.305050 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 25 07:36:06 crc kubenswrapper[4749]: E0225 07:36:06.305076 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 25 07:36:06 crc kubenswrapper[4749]: E0225 07:36:06.305137 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift podName:4664b9ba-7c02-431a-89c5-715d216ee127 nodeName:}" failed. No retries permitted until 2026-02-25 07:36:14.305113747 +0000 UTC m=+1127.666939807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift") pod "swift-storage-0" (UID: "4664b9ba-7c02-431a-89c5-715d216ee127") : configmap "swift-ring-files" not found Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.686716 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r2k76" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.733855 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.812512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts\") pod \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.812622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r\") pod \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\" (UID: \"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520\") " Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.812677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42dzd\" (UniqueName: \"kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd\") pod \"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f\" (UID: \"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f\") " 
Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.820865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd" (OuterVolumeSpecName: "kube-api-access-42dzd") pod "4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" (UID: "4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f"). InnerVolumeSpecName "kube-api-access-42dzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.825934 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" (UID: "ccb7dbf6-31d2-47a5-ae8c-cce3ae307520"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.830498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r" (OuterVolumeSpecName: "kube-api-access-lt98r") pod "ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" (UID: "ccb7dbf6-31d2-47a5-ae8c-cce3ae307520"). InnerVolumeSpecName "kube-api-access-lt98r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.915187 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.915255 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt98r\" (UniqueName: \"kubernetes.io/projected/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520-kube-api-access-lt98r\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:06 crc kubenswrapper[4749]: I0225 07:36:06.915269 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42dzd\" (UniqueName: \"kubernetes.io/projected/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f-kube-api-access-42dzd\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.335234 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r2k76" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.336421 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.337368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r2k76" event={"ID":"ccb7dbf6-31d2-47a5-ae8c-cce3ae307520","Type":"ContainerDied","Data":"29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33"} Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.337403 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29fe403540ad8fb6c88f2fd293fd1f8918af3d761096916c4018a2c602005b33" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.337422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533416-7mtkv" event={"ID":"4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f","Type":"ContainerDied","Data":"a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d"} Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.337464 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1f42f992405e7dda2f17582af4fa1250d782d6ce0b4912504ce3bd87551c06d" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.387325 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533410-8sgtm"] Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.393923 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533410-8sgtm"] Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.612103 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.675650 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:36:07 crc kubenswrapper[4749]: I0225 07:36:07.675988 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="dnsmasq-dns" containerID="cri-o://ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13" gracePeriod=10 Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.121737 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.239458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc\") pod \"62112103-059c-4b14-803d-8f6e6189f3b6\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.239533 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rtmr\" (UniqueName: \"kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr\") pod \"62112103-059c-4b14-803d-8f6e6189f3b6\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.239613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config\") pod \"62112103-059c-4b14-803d-8f6e6189f3b6\" (UID: \"62112103-059c-4b14-803d-8f6e6189f3b6\") " Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.244917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr" (OuterVolumeSpecName: "kube-api-access-7rtmr") pod "62112103-059c-4b14-803d-8f6e6189f3b6" (UID: "62112103-059c-4b14-803d-8f6e6189f3b6"). InnerVolumeSpecName "kube-api-access-7rtmr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.275501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config" (OuterVolumeSpecName: "config") pod "62112103-059c-4b14-803d-8f6e6189f3b6" (UID: "62112103-059c-4b14-803d-8f6e6189f3b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.277178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62112103-059c-4b14-803d-8f6e6189f3b6" (UID: "62112103-059c-4b14-803d-8f6e6189f3b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.337524 4749 generic.go:334] "Generic (PLEG): container finished" podID="62112103-059c-4b14-803d-8f6e6189f3b6" containerID="ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13" exitCode=0 Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.337561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" event={"ID":"62112103-059c-4b14-803d-8f6e6189f3b6","Type":"ContainerDied","Data":"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13"} Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.337582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" event={"ID":"62112103-059c-4b14-803d-8f6e6189f3b6","Type":"ContainerDied","Data":"1985cc41a20ca0286403f3594cb1e4a5a9dc3690bacf62ccc4b37629d3f201a6"} Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.337616 4749 scope.go:117] "RemoveContainer" containerID="ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13" Feb 25 07:36:08 crc kubenswrapper[4749]: 
I0225 07:36:08.337704 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-kmd6j" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.341509 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.341553 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rtmr\" (UniqueName: \"kubernetes.io/projected/62112103-059c-4b14-803d-8f6e6189f3b6-kube-api-access-7rtmr\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.341567 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62112103-059c-4b14-803d-8f6e6189f3b6-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.370681 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.375569 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-kmd6j"] Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.377626 4749 scope.go:117] "RemoveContainer" containerID="0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.396967 4749 scope.go:117] "RemoveContainer" containerID="ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13" Feb 25 07:36:08 crc kubenswrapper[4749]: E0225 07:36:08.397400 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13\": container with ID starting with ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13 not found: ID does 
not exist" containerID="ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.397431 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13"} err="failed to get container status \"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13\": rpc error: code = NotFound desc = could not find container \"ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13\": container with ID starting with ebabfe1e13614eea1d23a934a9673013950c8671bd35e22bb3878f63f58cfc13 not found: ID does not exist" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.397452 4749 scope.go:117] "RemoveContainer" containerID="0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a" Feb 25 07:36:08 crc kubenswrapper[4749]: E0225 07:36:08.398259 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a\": container with ID starting with 0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a not found: ID does not exist" containerID="0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.398304 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a"} err="failed to get container status \"0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a\": rpc error: code = NotFound desc = could not find container \"0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a\": container with ID starting with 0a4b5d4a92abacaf31d7de08f893c850cc33b232d21051e453f3a914829e1a7a not found: ID does not exist" Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.435989 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-r2k76"] Feb 25 07:36:08 crc kubenswrapper[4749]: I0225 07:36:08.442913 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-r2k76"] Feb 25 07:36:09 crc kubenswrapper[4749]: I0225 07:36:09.348147 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58920cad-4515-427a-a4fe-1050a058a462" path="/var/lib/kubelet/pods/58920cad-4515-427a-a4fe-1050a058a462/volumes" Feb 25 07:36:09 crc kubenswrapper[4749]: I0225 07:36:09.349718 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" path="/var/lib/kubelet/pods/62112103-059c-4b14-803d-8f6e6189f3b6/volumes" Feb 25 07:36:09 crc kubenswrapper[4749]: I0225 07:36:09.350816 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" path="/var/lib/kubelet/pods/ccb7dbf6-31d2-47a5-ae8c-cce3ae307520/volumes" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.157212 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.379072 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d9bb0e5-da1f-480e-8a59-e00767290acc" containerID="956f40ac6e605b683067944921d5c6faa686acdcaf45a61325dd3833b1e9df4d" exitCode=0 Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.379132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tw69n" event={"ID":"9d9bb0e5-da1f-480e-8a59-e00767290acc","Type":"ContainerDied","Data":"956f40ac6e605b683067944921d5c6faa686acdcaf45a61325dd3833b1e9df4d"} Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863302 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8qtd2"] Feb 25 07:36:11 crc kubenswrapper[4749]: E0225 07:36:11.863683 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="init" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863697 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="init" Feb 25 07:36:11 crc kubenswrapper[4749]: E0225 07:36:11.863707 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" containerName="mariadb-account-create-update" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863713 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" containerName="mariadb-account-create-update" Feb 25 07:36:11 crc kubenswrapper[4749]: E0225 07:36:11.863729 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="dnsmasq-dns" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863737 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="dnsmasq-dns" Feb 25 07:36:11 crc kubenswrapper[4749]: E0225 07:36:11.863752 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" containerName="oc" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863758 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" containerName="oc" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863910 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="62112103-059c-4b14-803d-8f6e6189f3b6" containerName="dnsmasq-dns" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863921 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb7dbf6-31d2-47a5-ae8c-cce3ae307520" containerName="mariadb-account-create-update" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.863934 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" containerName="oc" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.864467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.865713 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 25 07:36:11 crc kubenswrapper[4749]: I0225 07:36:11.876202 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8qtd2"] Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.020680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.020866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchkp\" (UniqueName: \"kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.123143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.123957 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tchkp\" (UniqueName: \"kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.124237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.151314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchkp\" (UniqueName: \"kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp\") pod \"root-account-create-update-8qtd2\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:12 crc kubenswrapper[4749]: I0225 07:36:12.180969 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:14 crc kubenswrapper[4749]: I0225 07:36:14.368385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:36:14 crc kubenswrapper[4749]: I0225 07:36:14.394516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4664b9ba-7c02-431a-89c5-715d216ee127-etc-swift\") pod \"swift-storage-0\" (UID: \"4664b9ba-7c02-431a-89c5-715d216ee127\") " pod="openstack/swift-storage-0" Feb 25 07:36:14 crc kubenswrapper[4749]: I0225 07:36:14.419915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 25 07:36:16 crc kubenswrapper[4749]: I0225 07:36:16.723066 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9jt2g" podUID="4197a74b-885d-41df-8484-05e645656b2a" containerName="ovn-controller" probeResult="failure" output=< Feb 25 07:36:16 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 25 07:36:16 crc kubenswrapper[4749]: > Feb 25 07:36:16 crc kubenswrapper[4749]: I0225 07:36:16.734499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:36:16 crc kubenswrapper[4749]: I0225 07:36:16.757816 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-c5k7v" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.002039 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jt2g-config-wfm7b"] Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.003463 4749 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.010193 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.035074 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-wfm7b"] Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9znvk\" (UniqueName: \"kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run\") pod 
\"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.146529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9znvk\" (UniqueName: \"kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.247909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.248264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.248338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.248406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.249335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.253042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.287626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9znvk\" (UniqueName: \"kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk\") pod \"ovn-controller-9jt2g-config-wfm7b\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:17 crc kubenswrapper[4749]: I0225 07:36:17.333459 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.452970 4749 generic.go:334] "Generic (PLEG): container finished" podID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerID="be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c" exitCode=0 Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.453079 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerDied","Data":"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c"} Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.455764 4749 generic.go:334] "Generic (PLEG): container finished" podID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerID="e913d3a116a8d0df455db3e543a9b853ca9274c3337b06248d654bc6853a9e36" exitCode=0 Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.455790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerDied","Data":"e913d3a116a8d0df455db3e543a9b853ca9274c3337b06248d654bc6853a9e36"} Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.811579 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.994694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.994980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995110 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4sxd\" 
(UniqueName: \"kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995155 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf\") pod \"9d9bb0e5-da1f-480e-8a59-e00767290acc\" (UID: \"9d9bb0e5-da1f-480e-8a59-e00767290acc\") " Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.995894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:18 crc kubenswrapper[4749]: I0225 07:36:18.996732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.002266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd" (OuterVolumeSpecName: "kube-api-access-w4sxd") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "kube-api-access-w4sxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.005249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.017729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.018527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts" (OuterVolumeSpecName: "scripts") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.023403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9d9bb0e5-da1f-480e-8a59-e00767290acc" (UID: "9d9bb0e5-da1f-480e-8a59-e00767290acc"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096723 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4sxd\" (UniqueName: \"kubernetes.io/projected/9d9bb0e5-da1f-480e-8a59-e00767290acc-kube-api-access-w4sxd\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096764 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096777 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096787 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096795 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9d9bb0e5-da1f-480e-8a59-e00767290acc-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096803 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d9bb0e5-da1f-480e-8a59-e00767290acc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.096810 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9d9bb0e5-da1f-480e-8a59-e00767290acc-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.223214 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-wfm7b"] Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.295971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8qtd2"] Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.384236 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 25 07:36:19 crc kubenswrapper[4749]: W0225 07:36:19.391927 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4664b9ba_7c02_431a_89c5_715d216ee127.slice/crio-0e58e6077842e80662468f78e05263ad7e2d9e799f1f4063a2ee70e5fc72e200 WatchSource:0}: Error finding container 0e58e6077842e80662468f78e05263ad7e2d9e799f1f4063a2ee70e5fc72e200: Status 404 returned error can't find the container with id 0e58e6077842e80662468f78e05263ad7e2d9e799f1f4063a2ee70e5fc72e200 Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.465319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerStarted","Data":"d218f78e714e77fdd26f9bd4efb44bcccf948b69774525348e530df7b0a6967d"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.465579 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.467161 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tw69n" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.467027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tw69n" event={"ID":"9d9bb0e5-da1f-480e-8a59-e00767290acc","Type":"ContainerDied","Data":"3cc9487d5df801cc34ade6877bf07f23835c7e73ee67366ae95914a0a3ff5a80"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.467529 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc9487d5df801cc34ade6877bf07f23835c7e73ee67366ae95914a0a3ff5a80" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.469617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8qtd2" event={"ID":"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54","Type":"ContainerStarted","Data":"ae7cadc027b19c16b295480ff12a0a73094760acf305a0b20f94843cd8c81398"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.471206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-wfm7b" event={"ID":"9b4a3071-2348-4eeb-86fa-2475e1744fdb","Type":"ContainerStarted","Data":"5dd9ab1a7756ab901d0ab5a685597268d34bf741c2e8c0ad77f223e3d217fc22"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.472270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"0e58e6077842e80662468f78e05263ad7e2d9e799f1f4063a2ee70e5fc72e200"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.477761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6lvxb" event={"ID":"0991404c-02d3-451e-9a9a-fbd93370e965","Type":"ContainerStarted","Data":"693e20096bb732c62ffb9d0b2bdfd8f9e6deee6e70a9b215dfa5163ac7abadae"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.480294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerStarted","Data":"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df"} Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.480479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.494059 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.761610648 podStartE2EDuration="59.494039712s" podCreationTimestamp="2026-02-25 07:35:20 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.172808637 +0000 UTC m=+1088.534634657" lastFinishedPulling="2026-02-25 07:35:42.905237701 +0000 UTC m=+1096.267063721" observedRunningTime="2026-02-25 07:36:19.491143929 +0000 UTC m=+1132.852969949" watchObservedRunningTime="2026-02-25 07:36:19.494039712 +0000 UTC m=+1132.855865722" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.521554 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.56846274 podStartE2EDuration="59.521539247s" podCreationTimestamp="2026-02-25 07:35:20 +0000 UTC" firstStartedPulling="2026-02-25 07:35:35.142937181 +0000 UTC m=+1088.504763211" lastFinishedPulling="2026-02-25 07:35:43.096013708 +0000 UTC m=+1096.457839718" observedRunningTime="2026-02-25 07:36:19.516008819 +0000 UTC m=+1132.877834839" watchObservedRunningTime="2026-02-25 07:36:19.521539247 +0000 UTC m=+1132.883365267" Feb 25 07:36:19 crc kubenswrapper[4749]: I0225 07:36:19.854894 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6lvxb" podStartSLOduration=2.306255232 podStartE2EDuration="14.854877076s" podCreationTimestamp="2026-02-25 07:36:05 +0000 UTC" firstStartedPulling="2026-02-25 07:36:06.239776668 +0000 UTC m=+1119.601602688" 
lastFinishedPulling="2026-02-25 07:36:18.788398512 +0000 UTC m=+1132.150224532" observedRunningTime="2026-02-25 07:36:19.536683515 +0000 UTC m=+1132.898509535" watchObservedRunningTime="2026-02-25 07:36:19.854877076 +0000 UTC m=+1133.216703096" Feb 25 07:36:20 crc kubenswrapper[4749]: I0225 07:36:20.498357 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" containerID="f813cb8cf0c5a0e1efe5e8d0320065344505244b9770bc5d44ccafb102e9d908" exitCode=0 Feb 25 07:36:20 crc kubenswrapper[4749]: I0225 07:36:20.498417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8qtd2" event={"ID":"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54","Type":"ContainerDied","Data":"f813cb8cf0c5a0e1efe5e8d0320065344505244b9770bc5d44ccafb102e9d908"} Feb 25 07:36:20 crc kubenswrapper[4749]: I0225 07:36:20.503604 4749 generic.go:334] "Generic (PLEG): container finished" podID="9b4a3071-2348-4eeb-86fa-2475e1744fdb" containerID="9a1cf854c42749ee2e4aa4d1793464acb638c97f9dd760d8a76adcb767cda8b9" exitCode=0 Feb 25 07:36:20 crc kubenswrapper[4749]: I0225 07:36:20.504363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-wfm7b" event={"ID":"9b4a3071-2348-4eeb-86fa-2475e1744fdb","Type":"ContainerDied","Data":"9a1cf854c42749ee2e4aa4d1793464acb638c97f9dd760d8a76adcb767cda8b9"} Feb 25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.514245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"4be8ab4aa036b6d0ecb3fda0d13336085f422a11aef4e459224d6f97edb4352e"} Feb 25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.515925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"60d3b90a66ffc70ad1a668db047f4db76681eafe6fd76235076b4d1ef09a3d33"} Feb 
25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.516054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"b76c47ed88924c75f1a52fd7d7150fc576a1ac88c70551f39587a02d3e043cf1"} Feb 25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.516152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"4ae039692120aef05fc74bc52e3ab44383e239f1efd875990c4c0385a6ef5d1d"} Feb 25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.688945 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9jt2g" Feb 25 07:36:21 crc kubenswrapper[4749]: I0225 07:36:21.951037 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.072451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.072497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9znvk\" (UniqueName: \"kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.072516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: 
\"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run\") pod \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\" (UID: \"9b4a3071-2348-4eeb-86fa-2475e1744fdb\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073508 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073675 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts" (OuterVolumeSpecName: "scripts") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.073540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run" (OuterVolumeSpecName: "var-run") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.076629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk" (OuterVolumeSpecName: "kube-api-access-9znvk") pod "9b4a3071-2348-4eeb-86fa-2475e1744fdb" (UID: "9b4a3071-2348-4eeb-86fa-2475e1744fdb"). InnerVolumeSpecName "kube-api-access-9znvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.094208 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.174630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tchkp\" (UniqueName: \"kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp\") pod \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.174691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts\") pod \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\" (UID: \"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54\") " Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175222 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175264 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175278 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9znvk\" (UniqueName: \"kubernetes.io/projected/9b4a3071-2348-4eeb-86fa-2475e1744fdb-kube-api-access-9znvk\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175289 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b4a3071-2348-4eeb-86fa-2475e1744fdb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175299 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9b4a3071-2348-4eeb-86fa-2475e1744fdb-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.175836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" (UID: "1e9f2c51-5213-42cd-bf2e-000e3ff7cf54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.178533 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp" (OuterVolumeSpecName: "kube-api-access-tchkp") pod "1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" (UID: "1e9f2c51-5213-42cd-bf2e-000e3ff7cf54"). InnerVolumeSpecName "kube-api-access-tchkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.276687 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tchkp\" (UniqueName: \"kubernetes.io/projected/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-kube-api-access-tchkp\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.276715 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.535486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8qtd2" event={"ID":"1e9f2c51-5213-42cd-bf2e-000e3ff7cf54","Type":"ContainerDied","Data":"ae7cadc027b19c16b295480ff12a0a73094760acf305a0b20f94843cd8c81398"} Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.535804 4749 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7cadc027b19c16b295480ff12a0a73094760acf305a0b20f94843cd8c81398" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.535740 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8qtd2" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.536728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-wfm7b" event={"ID":"9b4a3071-2348-4eeb-86fa-2475e1744fdb","Type":"ContainerDied","Data":"5dd9ab1a7756ab901d0ab5a685597268d34bf741c2e8c0ad77f223e3d217fc22"} Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.536756 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dd9ab1a7756ab901d0ab5a685597268d34bf741c2e8c0ad77f223e3d217fc22" Feb 25 07:36:22 crc kubenswrapper[4749]: I0225 07:36:22.536797 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-wfm7b" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.057807 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9jt2g-config-wfm7b"] Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.064585 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9jt2g-config-wfm7b"] Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105213 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jt2g-config-q5nwj"] Feb 25 07:36:23 crc kubenswrapper[4749]: E0225 07:36:23.105514 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9bb0e5-da1f-480e-8a59-e00767290acc" containerName="swift-ring-rebalance" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105533 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d9bb0e5-da1f-480e-8a59-e00767290acc" containerName="swift-ring-rebalance" Feb 25 
07:36:23 crc kubenswrapper[4749]: E0225 07:36:23.105544 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" containerName="mariadb-account-create-update" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105552 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" containerName="mariadb-account-create-update" Feb 25 07:36:23 crc kubenswrapper[4749]: E0225 07:36:23.105561 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4a3071-2348-4eeb-86fa-2475e1744fdb" containerName="ovn-config" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105568 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4a3071-2348-4eeb-86fa-2475e1744fdb" containerName="ovn-config" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105734 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4a3071-2348-4eeb-86fa-2475e1744fdb" containerName="ovn-config" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105747 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9bb0e5-da1f-480e-8a59-e00767290acc" containerName="swift-ring-rebalance" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.105760 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" containerName="mariadb-account-create-update" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.106232 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.110438 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.163116 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-q5nwj"] Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: 
\"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.190935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c25g\" (UniqueName: \"kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn\") pod 
\"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c25g\" (UniqueName: \"kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.293967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.294215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn\") pod 
\"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.294268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.294523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.295893 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.316147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c25g\" (UniqueName: \"kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g\") pod \"ovn-controller-9jt2g-config-q5nwj\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.334584 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4a3071-2348-4eeb-86fa-2475e1744fdb" path="/var/lib/kubelet/pods/9b4a3071-2348-4eeb-86fa-2475e1744fdb/volumes" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.422170 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.478173 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8qtd2"] Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.488855 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8qtd2"] Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.547570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"d1fa3280fd0f7aeb351a7d03cdf3736a10b72a2f9787d72c695a009a92901318"} Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.547630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"bc0d6b6a87f9f9cfe743de0fe4bec53aa6b36e578c9dd4175e543737930335f9"} Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.547645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"67ffe717dec439ba1fe3cbf0a88ee4a9891daecc49d0949d3b43bb4e999a2458"} Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.547656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"cd01d97394260b60ca2cd75f4bf3b1f30a9db53ad9aa4be9c68455ed9c59acac"} Feb 25 07:36:23 crc kubenswrapper[4749]: I0225 07:36:23.893951 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-q5nwj"] Feb 25 07:36:23 crc kubenswrapper[4749]: W0225 07:36:23.904007 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode55e1d3a_f8eb_412a_899f_4ba7a5255a73.slice/crio-3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de WatchSource:0}: Error finding container 3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de: Status 404 returned error can't find the container with id 3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de Feb 25 07:36:24 crc kubenswrapper[4749]: I0225 07:36:24.557022 4749 generic.go:334] "Generic (PLEG): container finished" podID="e55e1d3a-f8eb-412a-899f-4ba7a5255a73" containerID="d8dcb65447dffa4d38c02e1932c451659fb9b77ae7521f730abb90a55ab9a9cf" exitCode=0 Feb 25 07:36:24 crc kubenswrapper[4749]: I0225 07:36:24.557083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-q5nwj" event={"ID":"e55e1d3a-f8eb-412a-899f-4ba7a5255a73","Type":"ContainerDied","Data":"d8dcb65447dffa4d38c02e1932c451659fb9b77ae7521f730abb90a55ab9a9cf"} Feb 25 07:36:24 crc kubenswrapper[4749]: I0225 07:36:24.557365 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-q5nwj" event={"ID":"e55e1d3a-f8eb-412a-899f-4ba7a5255a73","Type":"ContainerStarted","Data":"3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de"} Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.332492 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9f2c51-5213-42cd-bf2e-000e3ff7cf54" path="/var/lib/kubelet/pods/1e9f2c51-5213-42cd-bf2e-000e3ff7cf54/volumes" Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.570942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"a102cf0074771ff18b95da1e727157f9624f3290458dc7d519b5b73e3e96317d"} Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.570990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"92b8e3f5fb36eb7396c02d647e5d41e9b499ef9f8bf5502458b9590569fd8c5e"} Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.571003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"f2e401f53f91159c01a80e9590384b296d3f48809123bb582bbbc9dfd597334c"} Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.571015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"b0bb0f79e5e100ced1de76031d271fda129f0001b2f6c78f74ed13980b3d7d93"} Feb 25 07:36:25 crc kubenswrapper[4749]: I0225 07:36:25.978114 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c25g\" (UniqueName: \"kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: 
\"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts\") pod \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\" (UID: \"e55e1d3a-f8eb-412a-899f-4ba7a5255a73\") " Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134815 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.134831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run" (OuterVolumeSpecName: "var-run") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.135159 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.135198 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.135723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.136422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts" (OuterVolumeSpecName: "scripts") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.139774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g" (OuterVolumeSpecName: "kube-api-access-7c25g") pod "e55e1d3a-f8eb-412a-899f-4ba7a5255a73" (UID: "e55e1d3a-f8eb-412a-899f-4ba7a5255a73"). InnerVolumeSpecName "kube-api-access-7c25g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.238221 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c25g\" (UniqueName: \"kubernetes.io/projected/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-kube-api-access-7c25g\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.238309 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.238365 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.238388 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e55e1d3a-f8eb-412a-899f-4ba7a5255a73-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.591232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"6c6a09cd1b608f7a89a43cb0de4604976a9a400e83a32098f057f4b2fbfbe731"} Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.591763 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"f5ef1be60f93970f15158d563986b317ed39f564db815ce7832f7dde2798a09a"} Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.594491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-q5nwj" event={"ID":"e55e1d3a-f8eb-412a-899f-4ba7a5255a73","Type":"ContainerDied","Data":"3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de"} Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.594536 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b0fdde655e4e7e07bc44a25bfe9165ebfdc48ac1eca245ffde208e8498ab4de" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.594636 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-q5nwj" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.926169 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fj2nl"] Feb 25 07:36:26 crc kubenswrapper[4749]: E0225 07:36:26.926646 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55e1d3a-f8eb-412a-899f-4ba7a5255a73" containerName="ovn-config" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.926660 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55e1d3a-f8eb-412a-899f-4ba7a5255a73" containerName="ovn-config" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.926837 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55e1d3a-f8eb-412a-899f-4ba7a5255a73" containerName="ovn-config" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.927414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.931765 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 25 07:36:26 crc kubenswrapper[4749]: I0225 07:36:26.938110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fj2nl"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.055165 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9jt2g-config-q5nwj"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.056358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9r5\" (UniqueName: \"kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.056769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.062271 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9jt2g-config-q5nwj"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.158161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " 
pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.158204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9r5\" (UniqueName: \"kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.159047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.183139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9r5\" (UniqueName: \"kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5\") pod \"root-account-create-update-fj2nl\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.210326 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jt2g-config-w5hfh"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.211254 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.213579 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.228084 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-w5hfh"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.246508 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.365354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.366955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.367080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.367125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.367198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.367274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26pj6\" (UniqueName: \"kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.396807 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55e1d3a-f8eb-412a-899f-4ba7a5255a73" path="/var/lib/kubelet/pods/e55e1d3a-f8eb-412a-899f-4ba7a5255a73/volumes" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26pj6\" (UniqueName: \"kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.470578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.471368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.471471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.471586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.472972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.473493 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.482122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.498056 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26pj6\" (UniqueName: \"kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6\") pod \"ovn-controller-9jt2g-config-w5hfh\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.534232 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.606770 4749 generic.go:334] "Generic (PLEG): container finished" podID="0991404c-02d3-451e-9a9a-fbd93370e965" containerID="693e20096bb732c62ffb9d0b2bdfd8f9e6deee6e70a9b215dfa5163ac7abadae" exitCode=0 Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.606825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6lvxb" event={"ID":"0991404c-02d3-451e-9a9a-fbd93370e965","Type":"ContainerDied","Data":"693e20096bb732c62ffb9d0b2bdfd8f9e6deee6e70a9b215dfa5163ac7abadae"} Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.614550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4664b9ba-7c02-431a-89c5-715d216ee127","Type":"ContainerStarted","Data":"3dc75fa25e13bb18c920f658dfe27b99f3e07bdfceab2b1b60d1ff27d9103fe9"} Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.666865 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=25.711527198 podStartE2EDuration="30.666850125s" podCreationTimestamp="2026-02-25 07:35:57 +0000 UTC" firstStartedPulling="2026-02-25 07:36:19.394630474 +0000 UTC m=+1132.756456484" lastFinishedPulling="2026-02-25 07:36:24.349953371 +0000 UTC m=+1137.711779411" observedRunningTime="2026-02-25 07:36:27.665842789 +0000 UTC m=+1141.027668819" watchObservedRunningTime="2026-02-25 07:36:27.666850125 +0000 UTC 
m=+1141.028676145" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.704188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fj2nl"] Feb 25 07:36:27 crc kubenswrapper[4749]: W0225 07:36:27.718034 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a6977a_847b_40c1_bc5a_f60eb7cc4f67.slice/crio-a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6 WatchSource:0}: Error finding container a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6: Status 404 returned error can't find the container with id a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6 Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.736222 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.921705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.922854 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.924958 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 25 07:36:27 crc kubenswrapper[4749]: I0225 07:36:27.940441 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.010825 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jt2g-config-w5hfh"] Feb 25 07:36:28 crc kubenswrapper[4749]: W0225 07:36:28.015731 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb78740e_2625_40fc_b589_8db203f61deb.slice/crio-0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb WatchSource:0}: Error finding container 0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb: Status 404 returned error can't find the container with id 0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tw6t\" (UniqueName: \"kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079151 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.079269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.180837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tw6t\" (UniqueName: \"kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc 
kubenswrapper[4749]: I0225 07:36:28.180998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.181143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.181326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.181420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.181517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.182198 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.182198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.182276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.183035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.183242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.205378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tw6t\" (UniqueName: 
\"kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t\") pod \"dnsmasq-dns-5c79d794d7-ldrhn\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.240954 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.492883 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:28 crc kubenswrapper[4749]: W0225 07:36:28.498637 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac1d2623_c884_4b43_919c_96637f768d50.slice/crio-0385cb365b5d22d67063224003de4dd46fc5919674ec883c95a29fe747e52e4a WatchSource:0}: Error finding container 0385cb365b5d22d67063224003de4dd46fc5919674ec883c95a29fe747e52e4a: Status 404 returned error can't find the container with id 0385cb365b5d22d67063224003de4dd46fc5919674ec883c95a29fe747e52e4a Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.641763 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" containerID="5f1371ad6e8cd5b13bf588e7e1ef5110215cf50a732ed6146ffd02bdd2fc4eec" exitCode=0 Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.641844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fj2nl" event={"ID":"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67","Type":"ContainerDied","Data":"5f1371ad6e8cd5b13bf588e7e1ef5110215cf50a732ed6146ffd02bdd2fc4eec"} Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.641875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fj2nl" event={"ID":"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67","Type":"ContainerStarted","Data":"a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6"} 
Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.644076 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb78740e-2625-40fc-b589-8db203f61deb" containerID="231c5baeb8aeacabac6c666088561386ea18543aef1702125aca038da0fdd3d8" exitCode=0 Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.644126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-w5hfh" event={"ID":"cb78740e-2625-40fc-b589-8db203f61deb","Type":"ContainerDied","Data":"231c5baeb8aeacabac6c666088561386ea18543aef1702125aca038da0fdd3d8"} Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.644141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-w5hfh" event={"ID":"cb78740e-2625-40fc-b589-8db203f61deb","Type":"ContainerStarted","Data":"0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb"} Feb 25 07:36:28 crc kubenswrapper[4749]: I0225 07:36:28.647847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" event={"ID":"ac1d2623-c884-4b43-919c-96637f768d50","Type":"ContainerStarted","Data":"0385cb365b5d22d67063224003de4dd46fc5919674ec883c95a29fe747e52e4a"} Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.086689 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.219968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data\") pod \"0991404c-02d3-451e-9a9a-fbd93370e965\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.220079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle\") pod \"0991404c-02d3-451e-9a9a-fbd93370e965\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.220146 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z25vw\" (UniqueName: \"kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw\") pod \"0991404c-02d3-451e-9a9a-fbd93370e965\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.220165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data\") pod \"0991404c-02d3-451e-9a9a-fbd93370e965\" (UID: \"0991404c-02d3-451e-9a9a-fbd93370e965\") " Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.226997 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw" (OuterVolumeSpecName: "kube-api-access-z25vw") pod "0991404c-02d3-451e-9a9a-fbd93370e965" (UID: "0991404c-02d3-451e-9a9a-fbd93370e965"). InnerVolumeSpecName "kube-api-access-z25vw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.227221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0991404c-02d3-451e-9a9a-fbd93370e965" (UID: "0991404c-02d3-451e-9a9a-fbd93370e965"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.252929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0991404c-02d3-451e-9a9a-fbd93370e965" (UID: "0991404c-02d3-451e-9a9a-fbd93370e965"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.278986 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data" (OuterVolumeSpecName: "config-data") pod "0991404c-02d3-451e-9a9a-fbd93370e965" (UID: "0991404c-02d3-451e-9a9a-fbd93370e965"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.323578 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.323628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z25vw\" (UniqueName: \"kubernetes.io/projected/0991404c-02d3-451e-9a9a-fbd93370e965-kube-api-access-z25vw\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.323641 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.323650 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0991404c-02d3-451e-9a9a-fbd93370e965-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.666847 4749 generic.go:334] "Generic (PLEG): container finished" podID="ac1d2623-c884-4b43-919c-96637f768d50" containerID="118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836" exitCode=0 Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.667280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" event={"ID":"ac1d2623-c884-4b43-919c-96637f768d50","Type":"ContainerDied","Data":"118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836"} Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.672899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6lvxb" 
event={"ID":"0991404c-02d3-451e-9a9a-fbd93370e965","Type":"ContainerDied","Data":"a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8"} Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.672955 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a541555021b5f937e870cf9c09e685e1b1694cf0d5a72474bb9c31081495caa8" Feb 25 07:36:29 crc kubenswrapper[4749]: I0225 07:36:29.673063 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6lvxb" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.029836 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.077165 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:36:30 crc kubenswrapper[4749]: E0225 07:36:30.077468 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0991404c-02d3-451e-9a9a-fbd93370e965" containerName="glance-db-sync" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.077480 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0991404c-02d3-451e-9a9a-fbd93370e965" containerName="glance-db-sync" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.077655 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0991404c-02d3-451e-9a9a-fbd93370e965" containerName="glance-db-sync" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.078410 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.096710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.099135 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.103697 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9r5\" (UniqueName: \"kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5\") pod \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run" (OuterVolumeSpecName: "var-run") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246248 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts\") pod \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\" (UID: \"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246359 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26pj6\" (UniqueName: \"kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246435 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts\") pod \"cb78740e-2625-40fc-b589-8db203f61deb\" (UID: \"cb78740e-2625-40fc-b589-8db203f61deb\") " Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.246807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xfsp\" (UniqueName: \"kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" 
(UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247412 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" (UID: "e5a6977a-847b-40c1-bc5a-f60eb7cc4f67"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247435 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247484 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cb78740e-2625-40fc-b589-8db203f61deb-var-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.247907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts" (OuterVolumeSpecName: "scripts") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.250733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6" (OuterVolumeSpecName: "kube-api-access-26pj6") pod "cb78740e-2625-40fc-b589-8db203f61deb" (UID: "cb78740e-2625-40fc-b589-8db203f61deb"). InnerVolumeSpecName "kube-api-access-26pj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.250799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5" (OuterVolumeSpecName: "kube-api-access-2p9r5") pod "e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" (UID: "e5a6977a-847b-40c1-bc5a-f60eb7cc4f67"). InnerVolumeSpecName "kube-api-access-2p9r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xfsp\" (UniqueName: \"kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: 
\"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349679 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349689 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26pj6\" (UniqueName: \"kubernetes.io/projected/cb78740e-2625-40fc-b589-8db203f61deb-kube-api-access-26pj6\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349701 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb78740e-2625-40fc-b589-8db203f61deb-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.349710 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9r5\" (UniqueName: \"kubernetes.io/projected/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-kube-api-access-2p9r5\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc 
kubenswrapper[4749]: I0225 07:36:30.349719 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.350797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.350864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.350870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.351033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.351152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.373815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xfsp\" (UniqueName: \"kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp\") pod \"dnsmasq-dns-5f59b8f679-c8nx7\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.417572 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.681185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" event={"ID":"ac1d2623-c884-4b43-919c-96637f768d50","Type":"ContainerStarted","Data":"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f"} Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.681649 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.683247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fj2nl" event={"ID":"e5a6977a-847b-40c1-bc5a-f60eb7cc4f67","Type":"ContainerDied","Data":"a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6"} Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.683285 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a346aa37379488a06b25649c182bb1e3e0dcfd68458e50043e4cdd01e4051ef6" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.683331 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fj2nl" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.684739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jt2g-config-w5hfh" event={"ID":"cb78740e-2625-40fc-b589-8db203f61deb","Type":"ContainerDied","Data":"0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb"} Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.684823 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0893a113ae4e9efff0a1abe6e4be30d7c4ea98dc162fc3c878b03801f29a6cbb" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.684919 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jt2g-config-w5hfh" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.706115 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" podStartSLOduration=3.706099257 podStartE2EDuration="3.706099257s" podCreationTimestamp="2026-02-25 07:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:30.70380755 +0000 UTC m=+1144.065633570" watchObservedRunningTime="2026-02-25 07:36:30.706099257 +0000 UTC m=+1144.067925267" Feb 25 07:36:30 crc kubenswrapper[4749]: I0225 07:36:30.889995 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.216034 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9jt2g-config-w5hfh"] Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.220126 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9jt2g-config-w5hfh"] Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.331139 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="cb78740e-2625-40fc-b589-8db203f61deb" path="/var/lib/kubelet/pods/cb78740e-2625-40fc-b589-8db203f61deb/volumes" Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.634752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.739019 4749 generic.go:334] "Generic (PLEG): container finished" podID="62e879c3-255b-4499-b388-41ffdaf75f79" containerID="38f9bd58fefa5938b6c403a4ae23da4286550ee0f6e4f3029426b23808456da1" exitCode=0 Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.739228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" event={"ID":"62e879c3-255b-4499-b388-41ffdaf75f79","Type":"ContainerDied","Data":"38f9bd58fefa5938b6c403a4ae23da4286550ee0f6e4f3029426b23808456da1"} Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.739446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" event={"ID":"62e879c3-255b-4499-b388-41ffdaf75f79","Type":"ContainerStarted","Data":"d265883baf82db4f707b1ef051b99b874fbdcca788021cfb7f8f8675c7286eae"} Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.740481 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="dnsmasq-dns" containerID="cri-o://89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f" gracePeriod=10 Feb 25 07:36:31 crc kubenswrapper[4749]: I0225 07:36:31.952508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.025914 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5mn2f"] Feb 25 07:36:32 crc kubenswrapper[4749]: E0225 07:36:32.026230 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb78740e-2625-40fc-b589-8db203f61deb" containerName="ovn-config" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.026241 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb78740e-2625-40fc-b589-8db203f61deb" containerName="ovn-config" Feb 25 07:36:32 crc kubenswrapper[4749]: E0225 07:36:32.026492 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" containerName="mariadb-account-create-update" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.026500 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" containerName="mariadb-account-create-update" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.026681 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb78740e-2625-40fc-b589-8db203f61deb" containerName="ovn-config" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.026701 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" containerName="mariadb-account-create-update" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.027159 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.058180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5mn2f"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.178804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.178951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmcl\" (UniqueName: \"kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.280603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.280914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmcl\" (UniqueName: \"kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.299884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.312211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqmcl\" (UniqueName: \"kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl\") pod \"cinder-db-create-5mn2f\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.338849 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.339461 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rdvkc"] Feb 25 07:36:32 crc kubenswrapper[4749]: E0225 07:36:32.344794 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="init" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.344820 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="init" Feb 25 07:36:32 crc kubenswrapper[4749]: E0225 07:36:32.344849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="dnsmasq-dns" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.344855 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="dnsmasq-dns" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.345076 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1d2623-c884-4b43-919c-96637f768d50" containerName="dnsmasq-dns" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.345562 4749 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.348993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sr7tv" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.349240 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.349364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.356099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.364362 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-634a-account-create-update-zm2md"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.365631 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.369774 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.378928 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.381944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tw6t\" (UniqueName: \"kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382098 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0\") pod \"ac1d2623-c884-4b43-919c-96637f768d50\" (UID: \"ac1d2623-c884-4b43-919c-96637f768d50\") " Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.382549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr72p\" (UniqueName: \"kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.407864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t" (OuterVolumeSpecName: "kube-api-access-6tw6t") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "kube-api-access-6tw6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.409033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rdvkc"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.443140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.456803 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-634a-account-create-update-zm2md"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.480667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config" (OuterVolumeSpecName: "config") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2cd\" (UniqueName: \"kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr72p\" (UniqueName: \"kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data\") pod \"keystone-db-sync-rdvkc\" (UID: 
\"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484695 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tw6t\" (UniqueName: \"kubernetes.io/projected/ac1d2623-c884-4b43-919c-96637f768d50-kube-api-access-6tw6t\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.484714 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.489255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.489829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.504118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.524072 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8318-account-create-update-jmlls"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.525757 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.526551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr72p\" (UniqueName: \"kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p\") pod \"keystone-db-sync-rdvkc\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.527980 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.531518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.580196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac1d2623-c884-4b43-919c-96637f768d50" (UID: "ac1d2623-c884-4b43-919c-96637f768d50"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.582070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8318-account-create-update-jmlls"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2cd\" (UniqueName: \"kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bkww\" (UniqueName: \"kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: 
I0225 07:36:32.587826 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587842 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.587851 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1d2623-c884-4b43-919c-96637f768d50-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.588481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.598674 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wwlbd"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.599751 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.613967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wwlbd"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.627723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2cd\" (UniqueName: \"kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd\") pod \"cinder-634a-account-create-update-zm2md\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.634093 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rdz5t"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.635129 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.642873 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b75e-account-create-update-ln7b8"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.643981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.646928 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.647858 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b75e-account-create-update-ln7b8"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.666626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rdz5t"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.667785 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.684181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bkww\" (UniqueName: \"kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts\") pod \"barbican-db-create-wwlbd\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689272 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcg5\" (UniqueName: \"kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5\") pod \"barbican-db-create-wwlbd\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689331 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb5zp\" (UniqueName: \"kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlb9\" (UniqueName: \"kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.689425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.692688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " 
pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.760956 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5mn2f"] Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.770559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bkww\" (UniqueName: \"kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww\") pod \"barbican-8318-account-create-update-jmlls\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.771884 4749 generic.go:334] "Generic (PLEG): container finished" podID="ac1d2623-c884-4b43-919c-96637f768d50" containerID="89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f" exitCode=0 Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.771969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" event={"ID":"ac1d2623-c884-4b43-919c-96637f768d50","Type":"ContainerDied","Data":"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f"} Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.772009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" event={"ID":"ac1d2623-c884-4b43-919c-96637f768d50","Type":"ContainerDied","Data":"0385cb365b5d22d67063224003de4dd46fc5919674ec883c95a29fe747e52e4a"} Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.772031 4749 scope.go:117] "RemoveContainer" containerID="89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.772213 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-ldrhn" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.792951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb5zp\" (UniqueName: \"kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.793021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlb9\" (UniqueName: \"kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.793071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.793146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.793168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts\") pod \"barbican-db-create-wwlbd\" (UID: 
\"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.794007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.794043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.794121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcg5\" (UniqueName: \"kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5\") pod \"barbican-db-create-wwlbd\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.794126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts\") pod \"barbican-db-create-wwlbd\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.794924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" event={"ID":"62e879c3-255b-4499-b388-41ffdaf75f79","Type":"ContainerStarted","Data":"6efdfd678cbee90bbe1332bfd07b2655ac1cb17e027801ccf8e38d1717e0b54c"} Feb 25 07:36:32 crc kubenswrapper[4749]: 
I0225 07:36:32.795666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.803355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5mn2f" event={"ID":"adbb415a-14d1-48e4-8baf-de714957de54","Type":"ContainerStarted","Data":"c73be43a92d9ab060e258e5d9e4d8850df9e3eb06754f0a39a9784e50091d2dd"} Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.817730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcg5\" (UniqueName: \"kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5\") pod \"barbican-db-create-wwlbd\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.835571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb5zp\" (UniqueName: \"kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp\") pod \"neutron-b75e-account-create-update-ln7b8\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.839517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlb9\" (UniqueName: \"kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9\") pod \"neutron-db-create-rdz5t\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.865292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.924027 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.967035 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:32 crc kubenswrapper[4749]: I0225 07:36:32.970749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.047425 4749 scope.go:117] "RemoveContainer" containerID="118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.094872 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" podStartSLOduration=3.094847754 podStartE2EDuration="3.094847754s" podCreationTimestamp="2026-02-25 07:36:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:32.828127685 +0000 UTC m=+1146.189953705" watchObservedRunningTime="2026-02-25 07:36:33.094847754 +0000 UTC m=+1146.456673774" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.097670 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.125765 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-ldrhn"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.147504 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rdvkc"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.161841 4749 scope.go:117] "RemoveContainer" containerID="89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f" Feb 25 07:36:33 crc kubenswrapper[4749]: E0225 07:36:33.162510 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f\": container with ID starting with 89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f not found: ID does not exist" containerID="89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.162545 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f"} err="failed to get container status \"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f\": rpc error: code = NotFound desc = could not find container \"89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f\": container with ID starting with 89413d991b55e710631c1c8707b3bbaa301843c45f9942e7fb7639fabf7ab01f not found: ID does not exist" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.162572 4749 scope.go:117] "RemoveContainer" containerID="118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836" Feb 25 07:36:33 crc kubenswrapper[4749]: E0225 07:36:33.165079 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836\": container with ID starting with 118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836 not found: ID does not exist" containerID="118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.165118 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836"} err="failed to get container status \"118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836\": rpc error: code = NotFound desc = could not find container 
\"118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836\": container with ID starting with 118e1acf635e87d5695349aa50993f74214eef90d739fc1cc6ad53579d14a836 not found: ID does not exist" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.210780 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.255100 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-634a-account-create-update-zm2md"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.362121 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1d2623-c884-4b43-919c-96637f768d50" path="/var/lib/kubelet/pods/ac1d2623-c884-4b43-919c-96637f768d50/volumes" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.369258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8318-account-create-update-jmlls"] Feb 25 07:36:33 crc kubenswrapper[4749]: E0225 07:36:33.392903 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac1d2623_c884_4b43_919c_96637f768d50.slice\": RecentStats: unable to find data in memory cache]" Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.453936 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wwlbd"] Feb 25 07:36:33 crc kubenswrapper[4749]: W0225 07:36:33.467309 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29504d02_065f_4682_aaab_4b0057c0a3e5.slice/crio-ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a WatchSource:0}: Error finding container ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a: Status 404 returned error can't find the container with id 
ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.623493 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fj2nl"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.634158 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fj2nl"] Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.672246 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b75e-account-create-update-ln7b8"] Feb 25 07:36:33 crc kubenswrapper[4749]: W0225 07:36:33.676256 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6a33493_4015_4e30_b004_3e3e501bef55.slice/crio-2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b WatchSource:0}: Error finding container 2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b: Status 404 returned error can't find the container with id 2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.781206 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rdz5t"] Feb 25 07:36:33 crc kubenswrapper[4749]: W0225 07:36:33.784803 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852f5f2e_07e1_4fcf_8c5d_fb17ea3c8a10.slice/crio-82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9 WatchSource:0}: Error finding container 82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9: Status 404 returned error can't find the container with id 82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9 Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.811520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-634a-account-create-update-zm2md" 
event={"ID":"a45bc1da-cff5-4bb0-884e-439a3cc64f38","Type":"ContainerStarted","Data":"8097df3d4aff98553ed69558b09d554bac2db512853630f346bdbf33b633ee68"} Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.812587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wwlbd" event={"ID":"29504d02-065f-4682-aaab-4b0057c0a3e5","Type":"ContainerStarted","Data":"ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a"} Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.813638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b75e-account-create-update-ln7b8" event={"ID":"e6a33493-4015-4e30-b004-3e3e501bef55","Type":"ContainerStarted","Data":"2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b"} Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.816484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rdvkc" event={"ID":"e3f34460-f59e-4a71-82f0-4486398bd903","Type":"ContainerStarted","Data":"0aded9084756e8d87352e48c0152d481d54a3309d75f7ca804a857f9c2de1ed1"} Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.817726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rdz5t" event={"ID":"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10","Type":"ContainerStarted","Data":"82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9"} Feb 25 07:36:33 crc kubenswrapper[4749]: I0225 07:36:33.818930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8318-account-create-update-jmlls" event={"ID":"5421bf98-9a20-489d-95b6-e96102c963da","Type":"ContainerStarted","Data":"5a89e327b0e78a7a32b661a6fa745d104b5089d63e8db88f86060d4f4e1555ed"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.333239 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a6977a-847b-40c1-bc5a-f60eb7cc4f67" path="/var/lib/kubelet/pods/e5a6977a-847b-40c1-bc5a-f60eb7cc4f67/volumes" Feb 25 
07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.839902 4749 generic.go:334] "Generic (PLEG): container finished" podID="5421bf98-9a20-489d-95b6-e96102c963da" containerID="9d0ad1a599539c7513fccb1cd55f646673985ea392c8409d1cd0dd7980e2b67d" exitCode=0 Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.839981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8318-account-create-update-jmlls" event={"ID":"5421bf98-9a20-489d-95b6-e96102c963da","Type":"ContainerDied","Data":"9d0ad1a599539c7513fccb1cd55f646673985ea392c8409d1cd0dd7980e2b67d"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.843058 4749 generic.go:334] "Generic (PLEG): container finished" podID="29504d02-065f-4682-aaab-4b0057c0a3e5" containerID="98d3c7179d19382db3c6a0297499b7639ea64490e3ca51747ed87508c431a4d7" exitCode=0 Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.843143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wwlbd" event={"ID":"29504d02-065f-4682-aaab-4b0057c0a3e5","Type":"ContainerDied","Data":"98d3c7179d19382db3c6a0297499b7639ea64490e3ca51747ed87508c431a4d7"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.844351 4749 generic.go:334] "Generic (PLEG): container finished" podID="a45bc1da-cff5-4bb0-884e-439a3cc64f38" containerID="5e07cf856dbe11b31d5c2fc33679ebeb80a608397f46b96ee63e5cb128881d09" exitCode=0 Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.844400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-634a-account-create-update-zm2md" event={"ID":"a45bc1da-cff5-4bb0-884e-439a3cc64f38","Type":"ContainerDied","Data":"5e07cf856dbe11b31d5c2fc33679ebeb80a608397f46b96ee63e5cb128881d09"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.846479 4749 generic.go:334] "Generic (PLEG): container finished" podID="e6a33493-4015-4e30-b004-3e3e501bef55" containerID="eb455f7390988a1bcd4d5bbfe34f53f6489a85308d1f76b5ec915caf9b9e456a" exitCode=0 Feb 25 07:36:35 crc 
kubenswrapper[4749]: I0225 07:36:35.846578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b75e-account-create-update-ln7b8" event={"ID":"e6a33493-4015-4e30-b004-3e3e501bef55","Type":"ContainerDied","Data":"eb455f7390988a1bcd4d5bbfe34f53f6489a85308d1f76b5ec915caf9b9e456a"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.848467 4749 generic.go:334] "Generic (PLEG): container finished" podID="852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" containerID="93c2d0bdb0f83d87cdf0667f833c88bd248b781592b05af3b547451c87a78998" exitCode=0 Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.848542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rdz5t" event={"ID":"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10","Type":"ContainerDied","Data":"93c2d0bdb0f83d87cdf0667f833c88bd248b781592b05af3b547451c87a78998"} Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.850617 4749 generic.go:334] "Generic (PLEG): container finished" podID="adbb415a-14d1-48e4-8baf-de714957de54" containerID="e6c0eee1f85c6edcdea8c9c6fd9ffc79f0186e1dcca9e7e603dbebcadb03e580" exitCode=0 Feb 25 07:36:35 crc kubenswrapper[4749]: I0225 07:36:35.850652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5mn2f" event={"ID":"adbb415a-14d1-48e4-8baf-de714957de54","Type":"ContainerDied","Data":"e6c0eee1f85c6edcdea8c9c6fd9ffc79f0186e1dcca9e7e603dbebcadb03e580"} Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.611564 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-klc58"] Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.614146 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.617038 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.639213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-klc58"] Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.707774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts\") pod \"root-account-create-update-klc58\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.707986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68m4\" (UniqueName: \"kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4\") pod \"root-account-create-update-klc58\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.808741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h68m4\" (UniqueName: \"kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4\") pod \"root-account-create-update-klc58\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.808802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts\") pod \"root-account-create-update-klc58\" (UID: 
\"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.809576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts\") pod \"root-account-create-update-klc58\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.844091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68m4\" (UniqueName: \"kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4\") pod \"root-account-create-update-klc58\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " pod="openstack/root-account-create-update-klc58" Feb 25 07:36:38 crc kubenswrapper[4749]: I0225 07:36:38.942586 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-klc58" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.590171 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.594892 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.615055 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.678449 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.699717 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqmcl\" (UniqueName: \"kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl\") pod \"adbb415a-14d1-48e4-8baf-de714957de54\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlb9\" (UniqueName: \"kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9\") pod \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r2cd\" (UniqueName: \"kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd\") pod \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts\") pod \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\" (UID: \"a45bc1da-cff5-4bb0-884e-439a3cc64f38\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts\") pod \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\" (UID: \"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.726616 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts\") pod \"adbb415a-14d1-48e4-8baf-de714957de54\" (UID: \"adbb415a-14d1-48e4-8baf-de714957de54\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.728146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a45bc1da-cff5-4bb0-884e-439a3cc64f38" (UID: "a45bc1da-cff5-4bb0-884e-439a3cc64f38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.728278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" (UID: "852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.728351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adbb415a-14d1-48e4-8baf-de714957de54" (UID: "adbb415a-14d1-48e4-8baf-de714957de54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.730300 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.733302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd" (OuterVolumeSpecName: "kube-api-access-8r2cd") pod "a45bc1da-cff5-4bb0-884e-439a3cc64f38" (UID: "a45bc1da-cff5-4bb0-884e-439a3cc64f38"). InnerVolumeSpecName "kube-api-access-8r2cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.733756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9" (OuterVolumeSpecName: "kube-api-access-ndlb9") pod "852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" (UID: "852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10"). InnerVolumeSpecName "kube-api-access-ndlb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.739687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl" (OuterVolumeSpecName: "kube-api-access-fqmcl") pod "adbb415a-14d1-48e4-8baf-de714957de54" (UID: "adbb415a-14d1-48e4-8baf-de714957de54"). InnerVolumeSpecName "kube-api-access-fqmcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bkww\" (UniqueName: \"kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww\") pod \"5421bf98-9a20-489d-95b6-e96102c963da\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb5zp\" (UniqueName: \"kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp\") pod \"e6a33493-4015-4e30-b004-3e3e501bef55\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts\") pod \"e6a33493-4015-4e30-b004-3e3e501bef55\" (UID: \"e6a33493-4015-4e30-b004-3e3e501bef55\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828514 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts\") pod \"5421bf98-9a20-489d-95b6-e96102c963da\" (UID: \"5421bf98-9a20-489d-95b6-e96102c963da\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828810 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45bc1da-cff5-4bb0-884e-439a3cc64f38-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828822 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828831 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adbb415a-14d1-48e4-8baf-de714957de54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828840 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqmcl\" (UniqueName: \"kubernetes.io/projected/adbb415a-14d1-48e4-8baf-de714957de54-kube-api-access-fqmcl\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828849 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlb9\" (UniqueName: \"kubernetes.io/projected/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10-kube-api-access-ndlb9\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.828859 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r2cd\" (UniqueName: \"kubernetes.io/projected/a45bc1da-cff5-4bb0-884e-439a3cc64f38-kube-api-access-8r2cd\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.829109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5421bf98-9a20-489d-95b6-e96102c963da" (UID: "5421bf98-9a20-489d-95b6-e96102c963da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.829315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6a33493-4015-4e30-b004-3e3e501bef55" (UID: "e6a33493-4015-4e30-b004-3e3e501bef55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.833219 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww" (OuterVolumeSpecName: "kube-api-access-8bkww") pod "5421bf98-9a20-489d-95b6-e96102c963da" (UID: "5421bf98-9a20-489d-95b6-e96102c963da"). InnerVolumeSpecName "kube-api-access-8bkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.835230 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp" (OuterVolumeSpecName: "kube-api-access-kb5zp") pod "e6a33493-4015-4e30-b004-3e3e501bef55" (UID: "e6a33493-4015-4e30-b004-3e3e501bef55"). InnerVolumeSpecName "kube-api-access-kb5zp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.909040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wwlbd" event={"ID":"29504d02-065f-4682-aaab-4b0057c0a3e5","Type":"ContainerDied","Data":"ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.909402 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec70d5915eb0435c6c053bb1c3cfeb5d40166e41167770ddc4296c654d60014a" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.909145 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wwlbd" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.911111 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-634a-account-create-update-zm2md" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.911112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-634a-account-create-update-zm2md" event={"ID":"a45bc1da-cff5-4bb0-884e-439a3cc64f38","Type":"ContainerDied","Data":"8097df3d4aff98553ed69558b09d554bac2db512853630f346bdbf33b633ee68"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.911570 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8097df3d4aff98553ed69558b09d554bac2db512853630f346bdbf33b633ee68" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.913390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b75e-account-create-update-ln7b8" event={"ID":"e6a33493-4015-4e30-b004-3e3e501bef55","Type":"ContainerDied","Data":"2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.913446 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2aa3bd2341f2cb22ae2e75a91dff8dd608a3aa04db8ac50526a9a968b8c1273b" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.913540 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b75e-account-create-update-ln7b8" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.925248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rdvkc" event={"ID":"e3f34460-f59e-4a71-82f0-4486398bd903","Type":"ContainerStarted","Data":"f78259553aa77bb6c03a624be52623826a73c94da505b7aac080dfa9b2811465"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.930747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rdz5t" event={"ID":"852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10","Type":"ContainerDied","Data":"82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.930923 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ff03c521ecb19768903acbd776e1c7c4e8c817eca752f81a51df61c99434c9" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.930808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts\") pod \"29504d02-065f-4682-aaab-4b0057c0a3e5\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.931330 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-rdz5t" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.931668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcg5\" (UniqueName: \"kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5\") pod \"29504d02-065f-4682-aaab-4b0057c0a3e5\" (UID: \"29504d02-065f-4682-aaab-4b0057c0a3e5\") " Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.932868 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bkww\" (UniqueName: \"kubernetes.io/projected/5421bf98-9a20-489d-95b6-e96102c963da-kube-api-access-8bkww\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.932916 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb5zp\" (UniqueName: \"kubernetes.io/projected/e6a33493-4015-4e30-b004-3e3e501bef55-kube-api-access-kb5zp\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.932938 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6a33493-4015-4e30-b004-3e3e501bef55-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.932959 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5421bf98-9a20-489d-95b6-e96102c963da-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.934553 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29504d02-065f-4682-aaab-4b0057c0a3e5" (UID: "29504d02-065f-4682-aaab-4b0057c0a3e5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.944691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5" (OuterVolumeSpecName: "kube-api-access-frcg5") pod "29504d02-065f-4682-aaab-4b0057c0a3e5" (UID: "29504d02-065f-4682-aaab-4b0057c0a3e5"). InnerVolumeSpecName "kube-api-access-frcg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.945047 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5mn2f" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.945093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5mn2f" event={"ID":"adbb415a-14d1-48e4-8baf-de714957de54","Type":"ContainerDied","Data":"c73be43a92d9ab060e258e5d9e4d8850df9e3eb06754f0a39a9784e50091d2dd"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.945144 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c73be43a92d9ab060e258e5d9e4d8850df9e3eb06754f0a39a9784e50091d2dd" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.948092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8318-account-create-update-jmlls" event={"ID":"5421bf98-9a20-489d-95b6-e96102c963da","Type":"ContainerDied","Data":"5a89e327b0e78a7a32b661a6fa745d104b5089d63e8db88f86060d4f4e1555ed"} Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.948143 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a89e327b0e78a7a32b661a6fa745d104b5089d63e8db88f86060d4f4e1555ed" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.948221 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8318-account-create-update-jmlls" Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.958557 4749 scope.go:117] "RemoveContainer" containerID="a9f1ec3358e14af7d5980d4c073cac917011a29d8978a4ff49c60255e1f6cfe6" Feb 25 07:36:39 crc kubenswrapper[4749]: W0225 07:36:39.966432 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe32b6c_b26a_4960_9b88_db584d3c56bf.slice/crio-a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629 WatchSource:0}: Error finding container a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629: Status 404 returned error can't find the container with id a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629 Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.967097 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-klc58"] Feb 25 07:36:39 crc kubenswrapper[4749]: I0225 07:36:39.972376 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rdvkc" podStartSLOduration=1.7081573030000001 podStartE2EDuration="7.972356347s" podCreationTimestamp="2026-02-25 07:36:32 +0000 UTC" firstStartedPulling="2026-02-25 07:36:33.210567139 +0000 UTC m=+1146.572393159" lastFinishedPulling="2026-02-25 07:36:39.474766173 +0000 UTC m=+1152.836592203" observedRunningTime="2026-02-25 07:36:39.957705662 +0000 UTC m=+1153.319531692" watchObservedRunningTime="2026-02-25 07:36:39.972356347 +0000 UTC m=+1153.334182377" Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.036083 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcg5\" (UniqueName: \"kubernetes.io/projected/29504d02-065f-4682-aaab-4b0057c0a3e5-kube-api-access-frcg5\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.036504 4749 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29504d02-065f-4682-aaab-4b0057c0a3e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.419938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.536192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.536428 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="dnsmasq-dns" containerID="cri-o://233fe70f6ad5d3572d4afc1feaeef24255f05ce045d2cbabbf5c417bd040eddd" gracePeriod=10 Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.958916 4749 generic.go:334] "Generic (PLEG): container finished" podID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerID="233fe70f6ad5d3572d4afc1feaeef24255f05ce045d2cbabbf5c417bd040eddd" exitCode=0 Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.959021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" event={"ID":"2f2f496a-5971-447e-887f-9d3c5e70f9ea","Type":"ContainerDied","Data":"233fe70f6ad5d3572d4afc1feaeef24255f05ce045d2cbabbf5c417bd040eddd"} Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.959084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" event={"ID":"2f2f496a-5971-447e-887f-9d3c5e70f9ea","Type":"ContainerDied","Data":"1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa"} Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.959098 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bc3c3f41d5c7137c6295243fff696a952bce388b22c65fb0da5daa0f92403aa" Feb 25 07:36:40 crc kubenswrapper[4749]: 
I0225 07:36:40.962317 4749 generic.go:334] "Generic (PLEG): container finished" podID="bbe32b6c-b26a-4960-9b88-db584d3c56bf" containerID="fd14d92db0185b6c0743ec6251b41f313737ee589de39c0a2b2a54c801c571dc" exitCode=0 Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.962367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-klc58" event={"ID":"bbe32b6c-b26a-4960-9b88-db584d3c56bf","Type":"ContainerDied","Data":"fd14d92db0185b6c0743ec6251b41f313737ee589de39c0a2b2a54c801c571dc"} Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.962427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-klc58" event={"ID":"bbe32b6c-b26a-4960-9b88-db584d3c56bf","Type":"ContainerStarted","Data":"a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629"} Feb 25 07:36:40 crc kubenswrapper[4749]: I0225 07:36:40.996419 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.054029 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgmgw\" (UniqueName: \"kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw\") pod \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.054088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc\") pod \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.054166 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb\") pod \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.054197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb\") pod \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.054230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config\") pod \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\" (UID: \"2f2f496a-5971-447e-887f-9d3c5e70f9ea\") " Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.069049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw" (OuterVolumeSpecName: "kube-api-access-cgmgw") pod "2f2f496a-5971-447e-887f-9d3c5e70f9ea" (UID: "2f2f496a-5971-447e-887f-9d3c5e70f9ea"). InnerVolumeSpecName "kube-api-access-cgmgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.103950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config" (OuterVolumeSpecName: "config") pod "2f2f496a-5971-447e-887f-9d3c5e70f9ea" (UID: "2f2f496a-5971-447e-887f-9d3c5e70f9ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.104972 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f2f496a-5971-447e-887f-9d3c5e70f9ea" (UID: "2f2f496a-5971-447e-887f-9d3c5e70f9ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.117985 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f2f496a-5971-447e-887f-9d3c5e70f9ea" (UID: "2f2f496a-5971-447e-887f-9d3c5e70f9ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.128796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f2f496a-5971-447e-887f-9d3c5e70f9ea" (UID: "2f2f496a-5971-447e-887f-9d3c5e70f9ea"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.155718 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.155746 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.155756 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.155766 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgmgw\" (UniqueName: \"kubernetes.io/projected/2f2f496a-5971-447e-887f-9d3c5e70f9ea-kube-api-access-cgmgw\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.155776 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f2f496a-5971-447e-887f-9d3c5e70f9ea-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:41 crc kubenswrapper[4749]: I0225 07:36:41.973186 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-nzpgj" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.014852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.028514 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-nzpgj"] Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.396222 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-klc58" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.598856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts\") pod \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.599036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h68m4\" (UniqueName: \"kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4\") pod \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\" (UID: \"bbe32b6c-b26a-4960-9b88-db584d3c56bf\") " Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.599676 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbe32b6c-b26a-4960-9b88-db584d3c56bf" (UID: "bbe32b6c-b26a-4960-9b88-db584d3c56bf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.605797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4" (OuterVolumeSpecName: "kube-api-access-h68m4") pod "bbe32b6c-b26a-4960-9b88-db584d3c56bf" (UID: "bbe32b6c-b26a-4960-9b88-db584d3c56bf"). InnerVolumeSpecName "kube-api-access-h68m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.701307 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe32b6c-b26a-4960-9b88-db584d3c56bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.701342 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h68m4\" (UniqueName: \"kubernetes.io/projected/bbe32b6c-b26a-4960-9b88-db584d3c56bf-kube-api-access-h68m4\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.982913 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3f34460-f59e-4a71-82f0-4486398bd903" containerID="f78259553aa77bb6c03a624be52623826a73c94da505b7aac080dfa9b2811465" exitCode=0 Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.983005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rdvkc" event={"ID":"e3f34460-f59e-4a71-82f0-4486398bd903","Type":"ContainerDied","Data":"f78259553aa77bb6c03a624be52623826a73c94da505b7aac080dfa9b2811465"} Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.985232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-klc58" event={"ID":"bbe32b6c-b26a-4960-9b88-db584d3c56bf","Type":"ContainerDied","Data":"a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629"} Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 
07:36:42.985286 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a13309baf77df819d28bed1eb7724eb417e36e56809f2436ad495d8c51cb9629" Feb 25 07:36:42 crc kubenswrapper[4749]: I0225 07:36:42.985357 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-klc58" Feb 25 07:36:43 crc kubenswrapper[4749]: I0225 07:36:43.335248 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" path="/var/lib/kubelet/pods/2f2f496a-5971-447e-887f-9d3c5e70f9ea/volumes" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.371981 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.446114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr72p\" (UniqueName: \"kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p\") pod \"e3f34460-f59e-4a71-82f0-4486398bd903\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.446170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle\") pod \"e3f34460-f59e-4a71-82f0-4486398bd903\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.446294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data\") pod \"e3f34460-f59e-4a71-82f0-4486398bd903\" (UID: \"e3f34460-f59e-4a71-82f0-4486398bd903\") " Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.450692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p" (OuterVolumeSpecName: "kube-api-access-hr72p") pod "e3f34460-f59e-4a71-82f0-4486398bd903" (UID: "e3f34460-f59e-4a71-82f0-4486398bd903"). InnerVolumeSpecName "kube-api-access-hr72p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.466478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3f34460-f59e-4a71-82f0-4486398bd903" (UID: "e3f34460-f59e-4a71-82f0-4486398bd903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.483671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data" (OuterVolumeSpecName: "config-data") pod "e3f34460-f59e-4a71-82f0-4486398bd903" (UID: "e3f34460-f59e-4a71-82f0-4486398bd903"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.547577 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.547820 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr72p\" (UniqueName: \"kubernetes.io/projected/e3f34460-f59e-4a71-82f0-4486398bd903-kube-api-access-hr72p\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:44 crc kubenswrapper[4749]: I0225 07:36:44.547830 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f34460-f59e-4a71-82f0-4486398bd903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.009955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rdvkc" event={"ID":"e3f34460-f59e-4a71-82f0-4486398bd903","Type":"ContainerDied","Data":"0aded9084756e8d87352e48c0152d481d54a3309d75f7ca804a857f9c2de1ed1"} Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.010016 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aded9084756e8d87352e48c0152d481d54a3309d75f7ca804a857f9c2de1ed1" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.010097 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rdvkc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.304675 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305140 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45bc1da-cff5-4bb0-884e-439a3cc64f38" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305164 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45bc1da-cff5-4bb0-884e-439a3cc64f38" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a33493-4015-4e30-b004-3e3e501bef55" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305198 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a33493-4015-4e30-b004-3e3e501bef55" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305220 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29504d02-065f-4682-aaab-4b0057c0a3e5" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305229 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29504d02-065f-4682-aaab-4b0057c0a3e5" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305241 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f34460-f59e-4a71-82f0-4486398bd903" containerName="keystone-db-sync" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305248 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f34460-f59e-4a71-82f0-4486398bd903" containerName="keystone-db-sync" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305258 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="dnsmasq-dns" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305265 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="dnsmasq-dns" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305277 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe32b6c-b26a-4960-9b88-db584d3c56bf" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305288 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe32b6c-b26a-4960-9b88-db584d3c56bf" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305302 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbb415a-14d1-48e4-8baf-de714957de54" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305309 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbb415a-14d1-48e4-8baf-de714957de54" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305322 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="init" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305329 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="init" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305346 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5421bf98-9a20-489d-95b6-e96102c963da" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305354 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5421bf98-9a20-489d-95b6-e96102c963da" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: E0225 07:36:45.305367 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305374 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305550 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbb415a-14d1-48e4-8baf-de714957de54" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305570 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2f496a-5971-447e-887f-9d3c5e70f9ea" containerName="dnsmasq-dns" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305580 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe32b6c-b26a-4960-9b88-db584d3c56bf" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305612 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29504d02-065f-4682-aaab-4b0057c0a3e5" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305625 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a33493-4015-4e30-b004-3e3e501bef55" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305636 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45bc1da-cff5-4bb0-884e-439a3cc64f38" containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305646 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" containerName="mariadb-database-create" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305657 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5421bf98-9a20-489d-95b6-e96102c963da" 
containerName="mariadb-account-create-update" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.305679 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f34460-f59e-4a71-82f0-4486398bd903" containerName="keystone-db-sync" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.306932 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.312248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rsq5p"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.313380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.314733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.317580 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.317865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.318045 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.318532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sr7tv" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.333919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwgd\" (UniqueName: \"kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.361601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mw8s\" (UniqueName: \"kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.369462 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsq5p"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.448358 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.457641 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.461583 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwgd\" (UniqueName: \"kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mw8s\" (UniqueName: \"kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s\") pod 
\"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.462874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d894\" (UniqueName: \"kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.463447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.463515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.463723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: 
\"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.463933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.464576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.466010 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.466190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.466303 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-brh4t" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.484993 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.490637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.490887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.497062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.497441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.497834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwgd\" (UniqueName: \"kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.547672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys\") pod \"keystone-bootstrap-rsq5p\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.548402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mw8s\" (UniqueName: \"kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s\") pod 
\"dnsmasq-dns-bbf5cc879-ljf5b\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d894\" (UniqueName: \"kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " 
pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.568950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.569428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.570383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.582137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.615374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d894\" (UniqueName: \"kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894\") pod \"horizon-788cf5dc45-snhxc\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.670100 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.670550 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.678848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.679883 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.688768 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.707734 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.707914 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.720518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.747991 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lth6k"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.749858 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.755808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.756098 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.756202 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lpz5m" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.779333 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lth6k"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.788056 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-lv8wj"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.789152 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.798931 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.799103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tdmhl" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.807156 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.807372 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.842779 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lv8wj"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.850530 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-sync-hv26f"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.851790 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.854195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.854378 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-852tl" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.854900 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.866098 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hv26f"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 
25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875658 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbhb\" (UniqueName: \"kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.875807 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.876105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.876282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.876515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmwn\" (UniqueName: \"kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.876643 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.884489 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.885743 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.909664 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.916369 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbbdf4ccc-w9pws" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.925778 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xld25"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.928489 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xld25" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.939366 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.939533 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59plz" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.942554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.963557 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.965133 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.970457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.970657 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.970756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wjttw" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.970901 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.978677 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngltf\" (UniqueName: \"kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd\") pod \"ceilometer-0\" (UID: 
\"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979647 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zmw\" (UniqueName: \"kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbhb\" (UniqueName: \"kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmwn\" (UniqueName: \"kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn\") pod \"ceilometer-0\" (UID: 
\"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.979895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:45 crc kubenswrapper[4749]: I0225 07:36:45.980881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:45.992475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:45.993001 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:45.993196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.003019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xld25"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.015098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmwn\" (UniqueName: \"kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.015645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.015858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.023456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.027050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts\") pod \"ceilometer-0\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.032970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbhb\" (UniqueName: \"kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.038972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.049193 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.051274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.052246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.065456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data\") pod \"cinder-db-sync-lth6k\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") " pod="openstack/cinder-db-sync-lth6k" Feb 25 07:36:46 crc kubenswrapper[4749]: 
I0225 07:36:46.065518 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.066783 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.081346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.081529 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.081780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zmw\" (UniqueName: \"kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f" Feb 25 07:36:46 crc 
kubenswrapper[4749]: I0225 07:36:46.082789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8j4j\" (UniqueName: \"kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082946 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.082965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.083026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.083877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lth6k"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.084484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.084505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vsjg\" (UniqueName: \"kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.084784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.085380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxq2\" (UniqueName: \"kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.085436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.085545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.087477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngltf\" (UniqueName: \"kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngrm\" (UniqueName: \"kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.100898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.110188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.110656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.111182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zmw\" (UniqueName: \"kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.118300 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.122216 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle\") pod \"placement-db-sync-hv26f\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") " pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.125205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngltf\" (UniqueName: \"kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.155397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle\") pod \"neutron-db-sync-lv8wj\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " pod="openstack/neutron-db-sync-lv8wj"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxq2\" (UniqueName: \"kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngrm\" (UniqueName: \"kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206841 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kprx\" (UniqueName: \"kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.206992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207021 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8j4j\" (UniqueName: \"kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.207536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vsjg\" (UniqueName: \"kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.208140 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.222825 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.223736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.224364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.224867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.225071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.227435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.228707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.229115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.229644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.229796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.230115 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv26f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.231685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.232505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.235189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vsjg\" (UniqueName: \"kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.235488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxq2\" (UniqueName: \"kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.235837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.236153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.236581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.245240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key\") pod \"horizon-bbbdf4ccc-w9pws\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.250087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.265027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " pod="openstack/glance-default-external-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.276235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngrm\" (UniqueName: \"kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm\") pod \"barbican-db-sync-xld25\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") " pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.282110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8j4j\" (UniqueName: \"kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j\") pod \"dnsmasq-dns-56df8fb6b7-pf87f\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") " pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kprx\" (UniqueName: \"kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308985 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.315229 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbbdf4ccc-w9pws"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.308835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.316888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.317007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.317100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.324086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.336721 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xld25"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.343111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.349726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kprx\" (UniqueName: \"kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.351435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.352532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0"
Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.359315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " 
pod="openstack/glance-default-internal-api-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.366790 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.368155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.422124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.447809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.562085 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.579178 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.593620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.763827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lth6k"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.775372 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsq5p"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.797363 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:36:46 crc kubenswrapper[4749]: I0225 07:36:46.822330 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hv26f"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.027608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xld25"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.035214 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.045190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsq5p" event={"ID":"4bc73d94-4436-43a8-963b-50efb29c9f8c","Type":"ContainerStarted","Data":"daf1229746c30846ae1262c9edc3c5358455d6adaedb0192eca6cb9a11e5168e"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.045246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsq5p" event={"ID":"4bc73d94-4436-43a8-963b-50efb29c9f8c","Type":"ContainerStarted","Data":"aacfe209e8d08dae2fbba222dc8425308a6a2d71c02f4db596e300c3455c4f7d"} Feb 25 07:36:47 crc kubenswrapper[4749]: 
I0225 07:36:47.050926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv26f" event={"ID":"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff","Type":"ContainerStarted","Data":"cd39873d8c0ce54ba1a1f51ee672faf6e825c8cd4e7b4e966049dfacd39f1bd5"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.052619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-788cf5dc45-snhxc" event={"ID":"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a","Type":"ContainerStarted","Data":"bcec80b9b3ad29c2203ff98a9310375f234d2222fa28dfc512a11f03b52fe956"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.054452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lth6k" event={"ID":"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b","Type":"ContainerStarted","Data":"0c191011eb4a7c76c4e5a5f43d6245630c8d092d88ff5cee39fbac7f175ff610"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.056408 4749 generic.go:334] "Generic (PLEG): container finished" podID="48f27676-ef88-40c4-b240-b45c6858d50e" containerID="943a1de35161ad4a943dcd409163e720085c716dca561cb0b12e1b3fd6ed6ad7" exitCode=0 Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.056469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" event={"ID":"48f27676-ef88-40c4-b240-b45c6858d50e","Type":"ContainerDied","Data":"943a1de35161ad4a943dcd409163e720085c716dca561cb0b12e1b3fd6ed6ad7"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.056485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" event={"ID":"48f27676-ef88-40c4-b240-b45c6858d50e","Type":"ContainerStarted","Data":"c52e2ee6e5a8ddfd58eedd0b34efd84af03c174eaf5f43e816dbefc5c3f85530"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.060796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerStarted","Data":"3f7ef6ae2a88ccffd9998836a9a29d9b07fdb36ab4d657829aac182d6d129bb0"} Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.086336 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rsq5p" podStartSLOduration=2.086319096 podStartE2EDuration="2.086319096s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:47.066711787 +0000 UTC m=+1160.428537817" watchObservedRunningTime="2026-02-25 07:36:47.086319096 +0000 UTC m=+1160.448145116" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.163662 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:36:47 crc kubenswrapper[4749]: W0225 07:36:47.170050 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9406fd_e048_480c_8b72_7e4949fb4456.slice/crio-6108e770a1455174b8d39e68b316c3e173c30bfe500d2a29ba48e62a9aba41e8 WatchSource:0}: Error finding container 6108e770a1455174b8d39e68b316c3e173c30bfe500d2a29ba48e62a9aba41e8: Status 404 returned error can't find the container with id 6108e770a1455174b8d39e68b316c3e173c30bfe500d2a29ba48e62a9aba41e8 Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.271279 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lv8wj"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.293099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.375942 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.551709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.551775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.558774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.558888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.558949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.558979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mw8s\" 
(UniqueName: \"kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s\") pod \"48f27676-ef88-40c4-b240-b45c6858d50e\" (UID: \"48f27676-ef88-40c4-b240-b45c6858d50e\") " Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.586883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s" (OuterVolumeSpecName: "kube-api-access-6mw8s") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "kube-api-access-6mw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.658367 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.680239 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mw8s\" (UniqueName: \"kubernetes.io/projected/48f27676-ef88-40c4-b240-b45c6858d50e-kube-api-access-6mw8s\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.730729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.731106 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.733626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.747323 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config" (OuterVolumeSpecName: "config") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.750855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "48f27676-ef88-40c4-b240-b45c6858d50e" (UID: "48f27676-ef88-40c4-b240-b45c6858d50e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.788611 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.788723 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.788816 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.788877 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.788929 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/48f27676-ef88-40c4-b240-b45c6858d50e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.812921 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.857659 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.870855 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:36:47 crc kubenswrapper[4749]: E0225 07:36:47.871214 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="48f27676-ef88-40c4-b240-b45c6858d50e" containerName="init" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.871226 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f27676-ef88-40c4-b240-b45c6858d50e" containerName="init" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.871398 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f27676-ef88-40c4-b240-b45c6858d50e" containerName="init" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.872450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.883322 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.890347 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.905968 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.991373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.991412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.991434 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jsmt\" (UniqueName: \"kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.991944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:47 crc kubenswrapper[4749]: I0225 07:36:47.991996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.089783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" event={"ID":"48f27676-ef88-40c4-b240-b45c6858d50e","Type":"ContainerDied","Data":"c52e2ee6e5a8ddfd58eedd0b34efd84af03c174eaf5f43e816dbefc5c3f85530"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.089832 4749 scope.go:117] "RemoveContainer" containerID="943a1de35161ad4a943dcd409163e720085c716dca561cb0b12e1b3fd6ed6ad7" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.090024 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-ljf5b" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.093946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.094017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.094046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.094484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.094564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.095173 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.095635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.095681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jsmt\" (UniqueName: \"kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.100295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.102803 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerID="69b07e39d33da43a08c383b427d2a1adcbfc5a5ed8fde4cb8de87bfcd29d0337" exitCode=0 Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.102875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" event={"ID":"fbb3099c-45d7-45ed-ae22-724581a55bf9","Type":"ContainerDied","Data":"69b07e39d33da43a08c383b427d2a1adcbfc5a5ed8fde4cb8de87bfcd29d0337"} 
Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.102908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" event={"ID":"fbb3099c-45d7-45ed-ae22-724581a55bf9","Type":"ContainerStarted","Data":"f49a8c993b4f4fa846e1412e639cbf0021e2a87c8a6bd32315d576afdc7704b1"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.109587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerStarted","Data":"da9322c015ad4ec2d99888d93a87cc19df9e9294bd812ceb1f5d7c20715390da"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.112644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xld25" event={"ID":"9dd296d6-e025-45bc-9149-d6595e7e683a","Type":"ContainerStarted","Data":"d605cc4f5858ca326aa857395479a2f26cf816ddc65148f463314e2b7b80a30b"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.117297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lv8wj" event={"ID":"857df7ae-9c0e-47b6-9248-2981d1b1d796","Type":"ContainerStarted","Data":"5ebb1bd025f7a445de93cc0182ab12d6b9979e6d7b6788cf4ed4ba3ef12ce2b7"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.117331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lv8wj" event={"ID":"857df7ae-9c0e-47b6-9248-2981d1b1d796","Type":"ContainerStarted","Data":"5de431d311174cda7fee568f45bf6449d80ca0f6e47f4fc152d1a333664b0298"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.119781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jsmt\" (UniqueName: \"kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt\") pod \"horizon-68649bf449-pggn9\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 
07:36:48.130887 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerStarted","Data":"6108e770a1455174b8d39e68b316c3e173c30bfe500d2a29ba48e62a9aba41e8"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.134192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbbdf4ccc-w9pws" event={"ID":"67e81fc8-858b-4558-a303-0b851c9f1428","Type":"ContainerStarted","Data":"9a39d9a68053768e6ece4ea01e7d5199229120d1ee80d78c6e63003909a9b8f6"} Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.148391 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-lv8wj" podStartSLOduration=3.148369681 podStartE2EDuration="3.148369681s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:48.140066244 +0000 UTC m=+1161.501892264" watchObservedRunningTime="2026-02-25 07:36:48.148369681 +0000 UTC m=+1161.510195701" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.188413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.203015 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-ljf5b"] Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.230121 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:36:48 crc kubenswrapper[4749]: I0225 07:36:48.897585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.163453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" event={"ID":"fbb3099c-45d7-45ed-ae22-724581a55bf9","Type":"ContainerStarted","Data":"9dd488be4eba59a1bbce345ddb637a15abf53905b04b932957e4e6a123f13e64"} Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.164682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.172815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerStarted","Data":"4939a261ca37b44f601307d4a0bf4e62f23f4b3eca6fe12acef219ca587248e8"} Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.174571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68649bf449-pggn9" event={"ID":"0f48c577-ad8e-430f-86b4-8a8b3951d4b4","Type":"ContainerStarted","Data":"70f11ae85d6c8e8dbc918451dec7dc36157487898e25c9f1ab7a2c0f924e1dcb"} Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.192312 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" podStartSLOduration=4.192296754 podStartE2EDuration="4.192296754s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:49.186002157 +0000 UTC m=+1162.547828177" watchObservedRunningTime="2026-02-25 07:36:49.192296754 +0000 UTC m=+1162.554122774" Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.204210 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerStarted","Data":"f019744e1d46a0e6e19b1adde9e5eaf7d44c2770b083f9ee5787f83d7c434f14"} Feb 25 07:36:49 crc kubenswrapper[4749]: I0225 07:36:49.340933 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f27676-ef88-40c4-b240-b45c6858d50e" path="/var/lib/kubelet/pods/48f27676-ef88-40c4-b240-b45c6858d50e/volumes" Feb 25 07:36:50 crc kubenswrapper[4749]: I0225 07:36:50.214857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerStarted","Data":"0c3ce9c09b9516f5be145f448717b9bc697dac68033ba67e245c875c056ebe9d"} Feb 25 07:36:50 crc kubenswrapper[4749]: I0225 07:36:50.214949 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-log" containerID="cri-o://f019744e1d46a0e6e19b1adde9e5eaf7d44c2770b083f9ee5787f83d7c434f14" gracePeriod=30 Feb 25 07:36:50 crc kubenswrapper[4749]: I0225 07:36:50.215025 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-httpd" containerID="cri-o://0c3ce9c09b9516f5be145f448717b9bc697dac68033ba67e245c875c056ebe9d" gracePeriod=30 Feb 25 07:36:50 crc kubenswrapper[4749]: I0225 07:36:50.246764 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.246717299 podStartE2EDuration="5.246717299s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:50.241295774 +0000 UTC 
m=+1163.603121794" watchObservedRunningTime="2026-02-25 07:36:50.246717299 +0000 UTC m=+1163.608543319" Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.225994 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerID="0c3ce9c09b9516f5be145f448717b9bc697dac68033ba67e245c875c056ebe9d" exitCode=0 Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.226225 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerID="f019744e1d46a0e6e19b1adde9e5eaf7d44c2770b083f9ee5787f83d7c434f14" exitCode=143 Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.226060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerDied","Data":"0c3ce9c09b9516f5be145f448717b9bc697dac68033ba67e245c875c056ebe9d"} Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.226283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerDied","Data":"f019744e1d46a0e6e19b1adde9e5eaf7d44c2770b083f9ee5787f83d7c434f14"} Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.231884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerStarted","Data":"1815482271fce0d90f5605ada7d75de7bb45a6c5b09d9ad85fa937e95050f048"} Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.231998 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-log" containerID="cri-o://4939a261ca37b44f601307d4a0bf4e62f23f4b3eca6fe12acef219ca587248e8" gracePeriod=30 Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.232389 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-httpd" containerID="cri-o://1815482271fce0d90f5605ada7d75de7bb45a6c5b09d9ad85fa937e95050f048" gracePeriod=30 Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.236051 4749 generic.go:334] "Generic (PLEG): container finished" podID="4bc73d94-4436-43a8-963b-50efb29c9f8c" containerID="daf1229746c30846ae1262c9edc3c5358455d6adaedb0192eca6cb9a11e5168e" exitCode=0 Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.236071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsq5p" event={"ID":"4bc73d94-4436-43a8-963b-50efb29c9f8c","Type":"ContainerDied","Data":"daf1229746c30846ae1262c9edc3c5358455d6adaedb0192eca6cb9a11e5168e"} Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.260583 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.260567263 podStartE2EDuration="6.260567263s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:36:51.248729767 +0000 UTC m=+1164.610555787" watchObservedRunningTime="2026-02-25 07:36:51.260567263 +0000 UTC m=+1164.622393283" Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.672005 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:36:51 crc kubenswrapper[4749]: I0225 07:36:51.672049 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:36:52 crc kubenswrapper[4749]: I0225 07:36:52.245746 4749 generic.go:334] "Generic (PLEG): container finished" podID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerID="1815482271fce0d90f5605ada7d75de7bb45a6c5b09d9ad85fa937e95050f048" exitCode=0 Feb 25 07:36:52 crc kubenswrapper[4749]: I0225 07:36:52.246028 4749 generic.go:334] "Generic (PLEG): container finished" podID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerID="4939a261ca37b44f601307d4a0bf4e62f23f4b3eca6fe12acef219ca587248e8" exitCode=143 Feb 25 07:36:52 crc kubenswrapper[4749]: I0225 07:36:52.246205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerDied","Data":"1815482271fce0d90f5605ada7d75de7bb45a6c5b09d9ad85fa937e95050f048"} Feb 25 07:36:52 crc kubenswrapper[4749]: I0225 07:36:52.246232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerDied","Data":"4939a261ca37b44f601307d4a0bf4e62f23f4b3eca6fe12acef219ca587248e8"} Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.615420 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.646665 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.647964 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.652374 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.675011 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.748057 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.763835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75597b5c88-58jkm"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.765521 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts\") pod \"horizon-6955447f7b-mp5f9\" (UID: 
\"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzvk\" (UniqueName: \"kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.777646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.783211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75597b5c88-58jkm"] Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-combined-ca-bundle\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzvk\" (UniqueName: \"kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-tls-certs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmvp\" (UniqueName: \"kubernetes.io/projected/43b78d96-31fe-4729-aacf-09c66c121861-kube-api-access-phmvp\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-config-data\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-secret-key\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-scripts\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.879982 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b78d96-31fe-4729-aacf-09c66c121861-logs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.880014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.880017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.880729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts\") pod \"horizon-6955447f7b-mp5f9\" (UID: 
\"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.881716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.886104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.886870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.889245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.901184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzvk\" (UniqueName: \"kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk\") pod \"horizon-6955447f7b-mp5f9\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc 
kubenswrapper[4749]: I0225 07:36:54.971932 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.982005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-scripts\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.982252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b78d96-31fe-4729-aacf-09c66c121861-logs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.982742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-combined-ca-bundle\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.982874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-tls-certs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.983053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmvp\" (UniqueName: \"kubernetes.io/projected/43b78d96-31fe-4729-aacf-09c66c121861-kube-api-access-phmvp\") pod \"horizon-75597b5c88-58jkm\" (UID: 
\"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.983318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43b78d96-31fe-4729-aacf-09c66c121861-logs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.983628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-scripts\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.983848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-config-data\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.985405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-secret-key\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.986210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43b78d96-31fe-4729-aacf-09c66c121861-config-data\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.986930 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-tls-certs\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.992010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-horizon-secret-key\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:54 crc kubenswrapper[4749]: I0225 07:36:54.992812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b78d96-31fe-4729-aacf-09c66c121861-combined-ca-bundle\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:55 crc kubenswrapper[4749]: I0225 07:36:55.004125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmvp\" (UniqueName: \"kubernetes.io/projected/43b78d96-31fe-4729-aacf-09c66c121861-kube-api-access-phmvp\") pod \"horizon-75597b5c88-58jkm\" (UID: \"43b78d96-31fe-4729-aacf-09c66c121861\") " pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:55 crc kubenswrapper[4749]: I0225 07:36:55.087655 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:36:56 crc kubenswrapper[4749]: I0225 07:36:56.566159 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" Feb 25 07:36:56 crc kubenswrapper[4749]: I0225 07:36:56.647474 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:36:56 crc kubenswrapper[4749]: I0225 07:36:56.648357 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" containerID="cri-o://6efdfd678cbee90bbe1332bfd07b2655ac1cb17e027801ccf8e38d1717e0b54c" gracePeriod=10 Feb 25 07:36:57 crc kubenswrapper[4749]: I0225 07:36:57.300182 4749 generic.go:334] "Generic (PLEG): container finished" podID="62e879c3-255b-4499-b388-41ffdaf75f79" containerID="6efdfd678cbee90bbe1332bfd07b2655ac1cb17e027801ccf8e38d1717e0b54c" exitCode=0 Feb 25 07:36:57 crc kubenswrapper[4749]: I0225 07:36:57.300234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" event={"ID":"62e879c3-255b-4499-b388-41ffdaf75f79","Type":"ContainerDied","Data":"6efdfd678cbee90bbe1332bfd07b2655ac1cb17e027801ccf8e38d1717e0b54c"} Feb 25 07:37:00 crc kubenswrapper[4749]: I0225 07:37:00.418531 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Feb 25 07:37:01 crc kubenswrapper[4749]: E0225 07:37:01.033208 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 25 07:37:01 crc kubenswrapper[4749]: E0225 07:37:01.033777 4749 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncch668h5ffhb9h89h589h95hch5ddh546h79h58chc4h67ch5b6h659h5h55dh7bh669h58ch6bh89h675h84h694h5f4h8dhb4h546h5cfh687q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ktxq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-bbbdf4ccc-w9pws_openstack(67e81fc8-858b-4558-a303-0b851c9f1428): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:01 crc kubenswrapper[4749]: E0225 07:37:01.035984 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-bbbdf4ccc-w9pws" podUID="67e81fc8-858b-4558-a303-0b851c9f1428" Feb 25 07:37:02 crc kubenswrapper[4749]: E0225 07:37:02.770312 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 25 07:37:02 crc kubenswrapper[4749]: E0225 07:37:02.770984 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n668hbfh76h645h59dhb6h66dh64bhf9h84h5f4hdbh5d4h5f8h6dhf4hb5h588hc6h668hfbh599h574h7ch5fh695h54dhcchc7h669hdfh6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7d894,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-788cf5dc45-snhxc_openstack(a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:02 crc kubenswrapper[4749]: E0225 07:37:02.775438 
4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-788cf5dc45-snhxc" podUID="a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.806706 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.813198 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880523 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bwgd\" (UniqueName: \"kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880869 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsjg\" (UniqueName: \"kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880928 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.880981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: 
\"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs\") pod \"1c9406fd-e048-480c-8b72-7e4949fb4456\" (UID: \"1c9406fd-e048-480c-8b72-7e4949fb4456\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.881207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts\") pod \"4bc73d94-4436-43a8-963b-50efb29c9f8c\" (UID: \"4bc73d94-4436-43a8-963b-50efb29c9f8c\") " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.882083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.882265 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs" (OuterVolumeSpecName: "logs") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.889721 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.890287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.892188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg" (OuterVolumeSpecName: "kube-api-access-9vsjg") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "kube-api-access-9vsjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.899711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts" (OuterVolumeSpecName: "scripts") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.911542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd" (OuterVolumeSpecName: "kube-api-access-4bwgd") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "kube-api-access-4bwgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.915205 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts" (OuterVolumeSpecName: "scripts") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.916016 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.927193 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.950729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data" (OuterVolumeSpecName: "config-data") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.954822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data" (OuterVolumeSpecName: "config-data") pod "4bc73d94-4436-43a8-963b-50efb29c9f8c" (UID: "4bc73d94-4436-43a8-963b-50efb29c9f8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.955915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.967742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c9406fd-e048-480c-8b72-7e4949fb4456" (UID: "1c9406fd-e048-480c-8b72-7e4949fb4456"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983508 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983531 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983540 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983548 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983557 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983565 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983605 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983617 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c9406fd-e048-480c-8b72-7e4949fb4456-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983627 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983635 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4bc73d94-4436-43a8-963b-50efb29c9f8c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983643 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983650 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bwgd\" (UniqueName: \"kubernetes.io/projected/4bc73d94-4436-43a8-963b-50efb29c9f8c-kube-api-access-4bwgd\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 07:37:02.983659 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c9406fd-e048-480c-8b72-7e4949fb4456-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:02 crc kubenswrapper[4749]: I0225 
07:37:02.983668 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vsjg\" (UniqueName: \"kubernetes.io/projected/1c9406fd-e048-480c-8b72-7e4949fb4456-kube-api-access-9vsjg\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.000918 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.085662 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.318927 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.325283 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.325437 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hngrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xld25_openstack(9dd296d6-e025-45bc-9149-d6595e7e683a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.326576 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xld25" 
podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.352172 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.352307 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h5b7h58chcfh59bh7ch5c5h674hcbh659h644h664h85h5cbh6fh65dh5dch89h55fhc4hb5h68dh97h5bhc9h556h65fh5dh597h56bhbbh5c9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5jsmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesyste
m:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-68649bf449-pggn9_openstack(0f48c577-ad8e-430f-86b4-8a8b3951d4b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.356775 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-68649bf449-pggn9" podUID="0f48c577-ad8e-430f-86b4-8a8b3951d4b4" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.391555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.401901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kprx\" (UniqueName: \"kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.401351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts" (OuterVolumeSpecName: 
"scripts") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.401978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402153 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run\") pod \"22aa8f39-2a5b-478b-a714-e4762ecfac91\" (UID: \"22aa8f39-2a5b-478b-a714-e4762ecfac91\") " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402731 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.402829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.403063 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs" (OuterVolumeSpecName: "logs") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.416001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx" (OuterVolumeSpecName: "kube-api-access-4kprx") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "kube-api-access-4kprx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.445399 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.445517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.445403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22aa8f39-2a5b-478b-a714-e4762ecfac91","Type":"ContainerDied","Data":"da9322c015ad4ec2d99888d93a87cc19df9e9294bd812ceb1f5d7c20715390da"} Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.445586 4749 scope.go:117] "RemoveContainer" containerID="1815482271fce0d90f5605ada7d75de7bb45a6c5b09d9ad85fa937e95050f048" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.467845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsq5p" event={"ID":"4bc73d94-4436-43a8-963b-50efb29c9f8c","Type":"ContainerDied","Data":"aacfe209e8d08dae2fbba222dc8425308a6a2d71c02f4db596e300c3455c4f7d"} Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.467884 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aacfe209e8d08dae2fbba222dc8425308a6a2d71c02f4db596e300c3455c4f7d" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.467988 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rsq5p" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.470790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.474730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.477253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c9406fd-e048-480c-8b72-7e4949fb4456","Type":"ContainerDied","Data":"6108e770a1455174b8d39e68b316c3e173c30bfe500d2a29ba48e62a9aba41e8"} Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.477358 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.481993 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xld25" podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505634 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kprx\" (UniqueName: \"kubernetes.io/projected/22aa8f39-2a5b-478b-a714-e4762ecfac91-kube-api-access-4kprx\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505670 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505680 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505689 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505698 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.505705 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/22aa8f39-2a5b-478b-a714-e4762ecfac91-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.538969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data" (OuterVolumeSpecName: "config-data") pod "22aa8f39-2a5b-478b-a714-e4762ecfac91" (UID: "22aa8f39-2a5b-478b-a714-e4762ecfac91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.569530 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.592154 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.607180 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.607210 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22aa8f39-2a5b-478b-a714-e4762ecfac91-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.608917 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.617793 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.618204 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-httpd" Feb 25 07:37:03 crc 
kubenswrapper[4749]: I0225 07:37:03.618217 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-httpd" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.618236 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc73d94-4436-43a8-963b-50efb29c9f8c" containerName="keystone-bootstrap" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618242 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc73d94-4436-43a8-963b-50efb29c9f8c" containerName="keystone-bootstrap" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.618257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618263 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.618277 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-httpd" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618283 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-httpd" Feb 25 07:37:03 crc kubenswrapper[4749]: E0225 07:37:03.618293 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618299 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618470 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-httpd" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618481 
4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618495 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-log" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618502 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc73d94-4436-43a8-963b-50efb29c9f8c" containerName="keystone-bootstrap" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.618512 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" containerName="glance-httpd" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.619389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.622358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.622549 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.624045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.708621 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.708657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.708681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.708704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.708830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.709027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.709115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-c5rnm\" (UniqueName: \"kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.709208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.792935 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 
07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5rnm\" (UniqueName: \"kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811445 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.811474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 
07:37:03.812354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.812305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.812005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.815204 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.819282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.825538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 
07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.828631 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.830158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.832554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5rnm\" (UniqueName: \"kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.836122 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.836352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.842085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.845243 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.861244 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 
crc kubenswrapper[4749]: I0225 07:37:03.871665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.914569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5z9r\" (UniqueName: \"kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " 
pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.915485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.917099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.942297 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.956017 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rsq5p"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.976340 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rsq5p"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.993714 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kmwww"] Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.994894 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:03 crc kubenswrapper[4749]: I0225 07:37:03.997486 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:03.997681 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sr7tv" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:03.997715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:03.997806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.005364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5z9r\" (UniqueName: \"kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 
07:37:04.018405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018548 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.018800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jngsx\" (UniqueName: \"kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.019196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmwww"] Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.019229 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.020220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc 
kubenswrapper[4749]: I0225 07:37:04.020309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.024322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.025388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.034948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.042529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.045324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m5z9r\" (UniqueName: \"kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.078224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.119820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.119889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jngsx\" (UniqueName: \"kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.119942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.119969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.119988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.120028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.124124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.132806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.134226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle\") pod \"keystone-bootstrap-kmwww\" (UID: 
\"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.134354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.134432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.137568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jngsx\" (UniqueName: \"kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx\") pod \"keystone-bootstrap-kmwww\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.229944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:04 crc kubenswrapper[4749]: I0225 07:37:04.398367 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:05 crc kubenswrapper[4749]: I0225 07:37:05.344426 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c9406fd-e048-480c-8b72-7e4949fb4456" path="/var/lib/kubelet/pods/1c9406fd-e048-480c-8b72-7e4949fb4456/volumes" Feb 25 07:37:05 crc kubenswrapper[4749]: I0225 07:37:05.347178 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22aa8f39-2a5b-478b-a714-e4762ecfac91" path="/var/lib/kubelet/pods/22aa8f39-2a5b-478b-a714-e4762ecfac91/volumes" Feb 25 07:37:05 crc kubenswrapper[4749]: I0225 07:37:05.348523 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bc73d94-4436-43a8-963b-50efb29c9f8c" path="/var/lib/kubelet/pods/4bc73d94-4436-43a8-963b-50efb29c9f8c/volumes" Feb 25 07:37:10 crc kubenswrapper[4749]: I0225 07:37:10.419637 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 25 07:37:10 crc kubenswrapper[4749]: I0225 07:37:10.552522 4749 generic.go:334] "Generic (PLEG): container finished" podID="857df7ae-9c0e-47b6-9248-2981d1b1d796" containerID="5ebb1bd025f7a445de93cc0182ab12d6b9979e6d7b6788cf4ed4ba3ef12ce2b7" exitCode=0 Feb 25 07:37:10 crc kubenswrapper[4749]: I0225 07:37:10.552580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lv8wj" event={"ID":"857df7ae-9c0e-47b6-9248-2981d1b1d796","Type":"ContainerDied","Data":"5ebb1bd025f7a445de93cc0182ab12d6b9979e6d7b6788cf4ed4ba3ef12ce2b7"} Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.218638 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bbbdf4ccc-w9pws" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts\") pod \"67e81fc8-858b-4558-a303-0b851c9f1428\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data\") pod \"67e81fc8-858b-4558-a303-0b851c9f1428\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktxq2\" (UniqueName: \"kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2\") pod \"67e81fc8-858b-4558-a303-0b851c9f1428\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs\") pod \"67e81fc8-858b-4558-a303-0b851c9f1428\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key\") pod \"67e81fc8-858b-4558-a303-0b851c9f1428\" (UID: \"67e81fc8-858b-4558-a303-0b851c9f1428\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.364994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data" (OuterVolumeSpecName: "config-data") pod "67e81fc8-858b-4558-a303-0b851c9f1428" (UID: "67e81fc8-858b-4558-a303-0b851c9f1428"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.365149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts" (OuterVolumeSpecName: "scripts") pod "67e81fc8-858b-4558-a303-0b851c9f1428" (UID: "67e81fc8-858b-4558-a303-0b851c9f1428"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.365242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs" (OuterVolumeSpecName: "logs") pod "67e81fc8-858b-4558-a303-0b851c9f1428" (UID: "67e81fc8-858b-4558-a303-0b851c9f1428"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.372520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "67e81fc8-858b-4558-a303-0b851c9f1428" (UID: "67e81fc8-858b-4558-a303-0b851c9f1428"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.372541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2" (OuterVolumeSpecName: "kube-api-access-ktxq2") pod "67e81fc8-858b-4558-a303-0b851c9f1428" (UID: "67e81fc8-858b-4558-a303-0b851c9f1428"). InnerVolumeSpecName "kube-api-access-ktxq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.467247 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.467291 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67e81fc8-858b-4558-a303-0b851c9f1428-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.467313 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktxq2\" (UniqueName: \"kubernetes.io/projected/67e81fc8-858b-4558-a303-0b851c9f1428-kube-api-access-ktxq2\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.467333 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67e81fc8-858b-4558-a303-0b851c9f1428-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.467351 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/67e81fc8-858b-4558-a303-0b851c9f1428-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.560094 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbbdf4ccc-w9pws" event={"ID":"67e81fc8-858b-4558-a303-0b851c9f1428","Type":"ContainerDied","Data":"9a39d9a68053768e6ece4ea01e7d5199229120d1ee80d78c6e63003909a9b8f6"} Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.560157 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bbbdf4ccc-w9pws" Feb 25 07:37:11 crc kubenswrapper[4749]: E0225 07:37:11.585121 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 25 07:37:11 crc kubenswrapper[4749]: E0225 07:37:11.585437 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n86h5cbh658h5h598h694h688h65ch5f5h68dh5cdh5ch67dh585h5b7hdfh596h64fhc9h575h5f8h84h567h587h5ffhd6hch655h649h58fh697h679q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsmwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bi
n/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ec69779f-8eea-4feb-b9e3-a9f2d2bdabce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.629298 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.629362 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bbbdf4ccc-w9pws"] Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.733866 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.740035 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.759334 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.872946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.872984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data\") pod \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts\") pod \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873071 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts\") pod \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key\") pod \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key\") pod \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jsmt\" (UniqueName: \"kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt\") pod \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs\") pod \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 
07:37:11.873310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data\") pod \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\" (UID: \"0f48c577-ad8e-430f-86b4-8a8b3951d4b4\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xfsp\" (UniqueName: \"kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp\") pod \"62e879c3-255b-4499-b388-41ffdaf75f79\" (UID: \"62e879c3-255b-4499-b388-41ffdaf75f79\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873382 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d894\" (UniqueName: \"kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894\") pod \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.873406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs\") pod \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\" (UID: \"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a\") " Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.874046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs" (OuterVolumeSpecName: "logs") pod "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" (UID: 
"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.874291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs" (OuterVolumeSpecName: "logs") pod "0f48c577-ad8e-430f-86b4-8a8b3951d4b4" (UID: "0f48c577-ad8e-430f-86b4-8a8b3951d4b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.876135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data" (OuterVolumeSpecName: "config-data") pod "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" (UID: "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.878315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts" (OuterVolumeSpecName: "scripts") pod "0f48c577-ad8e-430f-86b4-8a8b3951d4b4" (UID: "0f48c577-ad8e-430f-86b4-8a8b3951d4b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.878576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" (UID: "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.878578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt" (OuterVolumeSpecName: "kube-api-access-5jsmt") pod "0f48c577-ad8e-430f-86b4-8a8b3951d4b4" (UID: "0f48c577-ad8e-430f-86b4-8a8b3951d4b4"). InnerVolumeSpecName "kube-api-access-5jsmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.878768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data" (OuterVolumeSpecName: "config-data") pod "0f48c577-ad8e-430f-86b4-8a8b3951d4b4" (UID: "0f48c577-ad8e-430f-86b4-8a8b3951d4b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.879070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts" (OuterVolumeSpecName: "scripts") pod "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" (UID: "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.881407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0f48c577-ad8e-430f-86b4-8a8b3951d4b4" (UID: "0f48c577-ad8e-430f-86b4-8a8b3951d4b4"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.881869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp" (OuterVolumeSpecName: "kube-api-access-7xfsp") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "kube-api-access-7xfsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.882059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894" (OuterVolumeSpecName: "kube-api-access-7d894") pod "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" (UID: "a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a"). InnerVolumeSpecName "kube-api-access-7d894". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.914507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.917583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config" (OuterVolumeSpecName: "config") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.918079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.918527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.925130 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62e879c3-255b-4499-b388-41ffdaf75f79" (UID: "62e879c3-255b-4499-b388-41ffdaf75f79"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975003 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975036 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975046 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xfsp\" (UniqueName: \"kubernetes.io/projected/62e879c3-255b-4499-b388-41ffdaf75f79-kube-api-access-7xfsp\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975057 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d894\" (UniqueName: \"kubernetes.io/projected/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-kube-api-access-7d894\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975067 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975077 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975084 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975092 4749 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975103 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975111 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975119 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975131 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975141 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jsmt\" (UniqueName: \"kubernetes.io/projected/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-kube-api-access-5jsmt\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975150 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f48c577-ad8e-430f-86b4-8a8b3951d4b4-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975160 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:11 crc kubenswrapper[4749]: I0225 07:37:11.975170 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e879c3-255b-4499-b388-41ffdaf75f79-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.573121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-788cf5dc45-snhxc" event={"ID":"a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a","Type":"ContainerDied","Data":"bcec80b9b3ad29c2203ff98a9310375f234d2222fa28dfc512a11f03b52fe956"} Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.573404 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-788cf5dc45-snhxc" Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.578398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" event={"ID":"62e879c3-255b-4499-b388-41ffdaf75f79","Type":"ContainerDied","Data":"d265883baf82db4f707b1ef051b99b874fbdcca788021cfb7f8f8675c7286eae"} Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.578471 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.582764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68649bf449-pggn9" event={"ID":"0f48c577-ad8e-430f-86b4-8a8b3951d4b4","Type":"ContainerDied","Data":"70f11ae85d6c8e8dbc918451dec7dc36157487898e25c9f1ab7a2c0f924e1dcb"} Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.582844 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68649bf449-pggn9" Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.647803 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.663271 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-c8nx7"] Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.687062 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.695979 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-788cf5dc45-snhxc"] Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.711787 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:37:12 crc kubenswrapper[4749]: I0225 07:37:12.719432 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68649bf449-pggn9"] Feb 25 07:37:13 crc kubenswrapper[4749]: I0225 07:37:13.337774 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f48c577-ad8e-430f-86b4-8a8b3951d4b4" path="/var/lib/kubelet/pods/0f48c577-ad8e-430f-86b4-8a8b3951d4b4/volumes" Feb 25 07:37:13 crc kubenswrapper[4749]: I0225 07:37:13.339041 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" path="/var/lib/kubelet/pods/62e879c3-255b-4499-b388-41ffdaf75f79/volumes" Feb 25 07:37:13 crc kubenswrapper[4749]: I0225 07:37:13.340563 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e81fc8-858b-4558-a303-0b851c9f1428" path="/var/lib/kubelet/pods/67e81fc8-858b-4558-a303-0b851c9f1428/volumes" Feb 25 07:37:13 crc kubenswrapper[4749]: I0225 07:37:13.341847 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a" 
path="/var/lib/kubelet/pods/a7e319a6-d6fd-4ed0-8f22-146a08b2fc6a/volumes" Feb 25 07:37:14 crc kubenswrapper[4749]: E0225 07:37:14.816478 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 25 07:37:14 crc kubenswrapper[4749]: E0225 07:37:14.817147 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,Su
bPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwbhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lth6k_openstack(8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 07:37:14 crc kubenswrapper[4749]: E0225 07:37:14.819016 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lth6k" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" Feb 25 07:37:14 crc kubenswrapper[4749]: I0225 07:37:14.832308 4749 scope.go:117] "RemoveContainer" containerID="4939a261ca37b44f601307d4a0bf4e62f23f4b3eca6fe12acef219ca587248e8" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.029055 4749 scope.go:117] "RemoveContainer" containerID="0c3ce9c09b9516f5be145f448717b9bc697dac68033ba67e245c875c056ebe9d" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.034777 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.061191 4749 scope.go:117] "RemoveContainer" containerID="f019744e1d46a0e6e19b1adde9e5eaf7d44c2770b083f9ee5787f83d7c434f14" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.108775 4749 scope.go:117] "RemoveContainer" containerID="6efdfd678cbee90bbe1332bfd07b2655ac1cb17e027801ccf8e38d1717e0b54c" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.135884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config\") pod \"857df7ae-9c0e-47b6-9248-2981d1b1d796\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.136003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngltf\" (UniqueName: \"kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf\") pod \"857df7ae-9c0e-47b6-9248-2981d1b1d796\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.136048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle\") pod \"857df7ae-9c0e-47b6-9248-2981d1b1d796\" (UID: \"857df7ae-9c0e-47b6-9248-2981d1b1d796\") " Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.140663 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf" (OuterVolumeSpecName: "kube-api-access-ngltf") pod "857df7ae-9c0e-47b6-9248-2981d1b1d796" (UID: "857df7ae-9c0e-47b6-9248-2981d1b1d796"). InnerVolumeSpecName "kube-api-access-ngltf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.142629 4749 scope.go:117] "RemoveContainer" containerID="38f9bd58fefa5938b6c403a4ae23da4286550ee0f6e4f3029426b23808456da1" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.161706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config" (OuterVolumeSpecName: "config") pod "857df7ae-9c0e-47b6-9248-2981d1b1d796" (UID: "857df7ae-9c0e-47b6-9248-2981d1b1d796"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.162242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "857df7ae-9c0e-47b6-9248-2981d1b1d796" (UID: "857df7ae-9c0e-47b6-9248-2981d1b1d796"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.238957 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.238996 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngltf\" (UniqueName: \"kubernetes.io/projected/857df7ae-9c0e-47b6-9248-2981d1b1d796-kube-api-access-ngltf\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.239010 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857df7ae-9c0e-47b6-9248-2981d1b1d796-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.374917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.388555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75597b5c88-58jkm"] Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.421832 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-c8nx7" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.539941 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kmwww"] Feb 25 07:37:15 crc kubenswrapper[4749]: W0225 07:37:15.588306 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54c92c50_880d_4bf0_823f_16f13d25066b.slice/crio-f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35 WatchSource:0}: Error finding container 
f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35: Status 404 returned error can't find the container with id f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35 Feb 25 07:37:15 crc kubenswrapper[4749]: W0225 07:37:15.593392 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18514a86_67f9_4c66_b5bb_59b4a2f34627.slice/crio-ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7 WatchSource:0}: Error finding container ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7: Status 404 returned error can't find the container with id ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7 Feb 25 07:37:15 crc kubenswrapper[4749]: W0225 07:37:15.596842 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43b78d96_31fe_4729_aacf_09c66c121861.slice/crio-954fb0e1f6a3ed68415816af70b41420b02bf4d3e139d69a6bc598c01dc6154b WatchSource:0}: Error finding container 954fb0e1f6a3ed68415816af70b41420b02bf4d3e139d69a6bc598c01dc6154b: Status 404 returned error can't find the container with id 954fb0e1f6a3ed68415816af70b41420b02bf4d3e139d69a6bc598c01dc6154b Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.611255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmwww" event={"ID":"18514a86-67f9-4c66-b5bb-59b4a2f34627","Type":"ContainerStarted","Data":"ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.614349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerStarted","Data":"f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.616998 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-sync-lv8wj" event={"ID":"857df7ae-9c0e-47b6-9248-2981d1b1d796","Type":"ContainerDied","Data":"5de431d311174cda7fee568f45bf6449d80ca0f6e47f4fc152d1a333664b0298"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.617041 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5de431d311174cda7fee568f45bf6449d80ca0f6e47f4fc152d1a333664b0298" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.617102 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lv8wj" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.619472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv26f" event={"ID":"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff","Type":"ContainerStarted","Data":"202531abc1663b771c9da627e2e6781d517f7d87e3c1d51ca8c45f6102bdc54f"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.628208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xld25" event={"ID":"9dd296d6-e025-45bc-9149-d6595e7e683a","Type":"ContainerStarted","Data":"0b8f218137bc510a74de90b8782b57aa796b646e29b704b0a8270409fc1b6d3c"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.630247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75597b5c88-58jkm" event={"ID":"43b78d96-31fe-4729-aacf-09c66c121861","Type":"ContainerStarted","Data":"954fb0e1f6a3ed68415816af70b41420b02bf4d3e139d69a6bc598c01dc6154b"} Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.639053 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hv26f" podStartSLOduration=5.848796368 podStartE2EDuration="30.639036859s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="2026-02-25 07:36:46.842737244 +0000 UTC m=+1160.204563264" lastFinishedPulling="2026-02-25 07:37:11.632977725 +0000 UTC m=+1184.994803755" 
observedRunningTime="2026-02-25 07:37:15.637069282 +0000 UTC m=+1188.998895302" watchObservedRunningTime="2026-02-25 07:37:15.639036859 +0000 UTC m=+1189.000862879" Feb 25 07:37:15 crc kubenswrapper[4749]: E0225 07:37:15.654855 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lth6k" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.666811 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xld25" podStartSLOduration=2.807435228 podStartE2EDuration="30.666788351s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="2026-02-25 07:36:47.098626743 +0000 UTC m=+1160.460452763" lastFinishedPulling="2026-02-25 07:37:14.957979856 +0000 UTC m=+1188.319805886" observedRunningTime="2026-02-25 07:37:15.658512751 +0000 UTC m=+1189.020338811" watchObservedRunningTime="2026-02-25 07:37:15.666788351 +0000 UTC m=+1189.028614381" Feb 25 07:37:15 crc kubenswrapper[4749]: I0225 07:37:15.692371 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.264045 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"] Feb 25 07:37:16 crc kubenswrapper[4749]: E0225 07:37:16.269069 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="init" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.269154 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="init" Feb 25 07:37:16 crc kubenswrapper[4749]: E0225 07:37:16.269232 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="857df7ae-9c0e-47b6-9248-2981d1b1d796" containerName="neutron-db-sync" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.269285 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="857df7ae-9c0e-47b6-9248-2981d1b1d796" containerName="neutron-db-sync" Feb 25 07:37:16 crc kubenswrapper[4749]: E0225 07:37:16.269347 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.269398 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.269639 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="62e879c3-255b-4499-b388-41ffdaf75f79" containerName="dnsmasq-dns" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.269760 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="857df7ae-9c0e-47b6-9248-2981d1b1d796" containerName="neutron-db-sync" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.271253 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.282151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"] Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.390779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.390884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtbd\" (UniqueName: \"kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.390935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.390997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.391108 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.391139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.405651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7675646668-99wj9"] Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.407066 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.411612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.411680 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.411735 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tdmhl" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.411803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.413557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7675646668-99wj9"] Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtbd\" (UniqueName: \"kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495574 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.495668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.496395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.497775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.498134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.498315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.499015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc\") pod 
\"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.518565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtbd\" (UniqueName: \"kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd\") pod \"dnsmasq-dns-6b7b667979-qnn4j\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.575877 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.596859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.597714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4c8\" (UniqueName: \"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.597767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.597798 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.597836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.654086 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.670085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerStarted","Data":"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.679810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerStarted","Data":"4d8d6ff5f005eda3d7bac28c9113d2df900871e30469621e74315136f4865155"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.699228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4c8\" (UniqueName: \"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.699295 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.699324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.699356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.699409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.710058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.710378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.712550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.712774 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75597b5c88-58jkm" event={"ID":"43b78d96-31fe-4729-aacf-09c66c121861","Type":"ContainerStarted","Data":"b3b80ed69cb78658bebd0cd82c8660ad82c2447f412653c9bbc5f470ee79c5e3"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.715767 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.716671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerStarted","Data":"6ecdc80088d758d8064604d275b68591330c03c74e584f8523b3ef0bfb16f8eb"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.716708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerStarted","Data":"dc4d0acb478851e6d5351b9cc3294410c73e844b905ec09032978ffc93fe4a7a"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.722837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-kmwww" event={"ID":"18514a86-67f9-4c66-b5bb-59b4a2f34627","Type":"ContainerStarted","Data":"e86715d496cbb6b72a1e08498898cae5fc9ebe0a100509ac2df6c1b23c436922"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.727062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerStarted","Data":"a8fadc4888e188860b266e79522b2d80458c0417ffe4c4080d67ad7ebc843fee"} Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.732304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4c8\" (UniqueName: \"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8\") pod \"neutron-7675646668-99wj9\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.749919 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kmwww" podStartSLOduration=13.74990066 podStartE2EDuration="13.74990066s" podCreationTimestamp="2026-02-25 07:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:16.742990964 +0000 UTC m=+1190.104816974" watchObservedRunningTime="2026-02-25 07:37:16.74990066 +0000 UTC m=+1190.111726680" Feb 25 07:37:16 crc kubenswrapper[4749]: I0225 07:37:16.761668 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7675646668-99wj9"
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.368694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"]
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.541387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7675646668-99wj9"]
Feb 25 07:37:17 crc kubenswrapper[4749]: W0225 07:37:17.549101 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd762ab38_78a5_4da4_a1a1_1831e6e069d3.slice/crio-f12adb14f9b67f8afce1fce27bc74052ed1fc254471765584823c02e29bf297d WatchSource:0}: Error finding container f12adb14f9b67f8afce1fce27bc74052ed1fc254471765584823c02e29bf297d: Status 404 returned error can't find the container with id f12adb14f9b67f8afce1fce27bc74052ed1fc254471765584823c02e29bf297d
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.749376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerStarted","Data":"e3018e6638b8d524d26767e09f5a4fdf417056f3b37a3e63abc914f28fb8ff5e"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.756378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerStarted","Data":"269cf05610217219000c47b8623644aac4dfeb65efba19487d9ef0872efbf736"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.762615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75597b5c88-58jkm" event={"ID":"43b78d96-31fe-4729-aacf-09c66c121861","Type":"ContainerStarted","Data":"ace24f93dcb0c8deac7d2378f46829bb2fcdd2ebb193812d4bf5dc9bf8e4cb69"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.766552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerStarted","Data":"f12adb14f9b67f8afce1fce27bc74052ed1fc254471765584823c02e29bf297d"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.784168 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6955447f7b-mp5f9" podStartSLOduration=23.374261861 podStartE2EDuration="23.784152787s" podCreationTimestamp="2026-02-25 07:36:54 +0000 UTC" firstStartedPulling="2026-02-25 07:37:15.593521487 +0000 UTC m=+1188.955347517" lastFinishedPulling="2026-02-25 07:37:16.003412423 +0000 UTC m=+1189.365238443" observedRunningTime="2026-02-25 07:37:17.783521852 +0000 UTC m=+1191.145347892" watchObservedRunningTime="2026-02-25 07:37:17.784152787 +0000 UTC m=+1191.145978807"
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.786023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerStarted","Data":"ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.818035 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75597b5c88-58jkm" podStartSLOduration=23.275644012 podStartE2EDuration="23.818020117s" podCreationTimestamp="2026-02-25 07:36:54 +0000 UTC" firstStartedPulling="2026-02-25 07:37:15.600841634 +0000 UTC m=+1188.962667654" lastFinishedPulling="2026-02-25 07:37:16.143217729 +0000 UTC m=+1189.505043759" observedRunningTime="2026-02-25 07:37:17.806447827 +0000 UTC m=+1191.168273857" watchObservedRunningTime="2026-02-25 07:37:17.818020117 +0000 UTC m=+1191.179846137"
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.839259 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.839242791 podStartE2EDuration="14.839242791s" podCreationTimestamp="2026-02-25 07:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:17.831010302 +0000 UTC m=+1191.192836322" watchObservedRunningTime="2026-02-25 07:37:17.839242791 +0000 UTC m=+1191.201068811"
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.865012 4749 generic.go:334] "Generic (PLEG): container finished" podID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerID="23d38970ce46055165d4f76c0bfdfa3cf725a07d8f0067141040decc707d4e18" exitCode=0
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.865783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" event={"ID":"01fcceaf-f66a-4144-af0e-d48af844d7aa","Type":"ContainerDied","Data":"23d38970ce46055165d4f76c0bfdfa3cf725a07d8f0067141040decc707d4e18"}
Feb 25 07:37:17 crc kubenswrapper[4749]: I0225 07:37:17.865807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" event={"ID":"01fcceaf-f66a-4144-af0e-d48af844d7aa","Type":"ContainerStarted","Data":"f76f7c399a5d44456016dbc6f5910190508c1df84749a497fd503c861687e208"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.879090 4749 generic.go:334] "Generic (PLEG): container finished" podID="9dd296d6-e025-45bc-9149-d6595e7e683a" containerID="0b8f218137bc510a74de90b8782b57aa796b646e29b704b0a8270409fc1b6d3c" exitCode=0
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.879152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xld25" event={"ID":"9dd296d6-e025-45bc-9149-d6595e7e683a","Type":"ContainerDied","Data":"0b8f218137bc510a74de90b8782b57aa796b646e29b704b0a8270409fc1b6d3c"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.881434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerStarted","Data":"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.881459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerStarted","Data":"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.881588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7675646668-99wj9"
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.882575 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" containerID="202531abc1663b771c9da627e2e6781d517f7d87e3c1d51ca8c45f6102bdc54f" exitCode=0
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.882617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv26f" event={"ID":"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff","Type":"ContainerDied","Data":"202531abc1663b771c9da627e2e6781d517f7d87e3c1d51ca8c45f6102bdc54f"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.884032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" event={"ID":"01fcceaf-f66a-4144-af0e-d48af844d7aa","Type":"ContainerStarted","Data":"357f1fd81a71a94296008937931668df18b3e93161f6578fb52c11e37b4ebbf6"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.884728 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j"
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.886408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerStarted","Data":"104bf19674a4995a089c2ebf313c6aefcdc3053dd0c3dfa8937f4615466bcaeb"}
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.923266 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.923244623 podStartE2EDuration="15.923244623s" podCreationTimestamp="2026-02-25 07:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:18.918801115 +0000 UTC m=+1192.280627135" watchObservedRunningTime="2026-02-25 07:37:18.923244623 +0000 UTC m=+1192.285070653"
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.946762 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" podStartSLOduration=2.946745821 podStartE2EDuration="2.946745821s" podCreationTimestamp="2026-02-25 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:18.938970783 +0000 UTC m=+1192.300796803" watchObservedRunningTime="2026-02-25 07:37:18.946745821 +0000 UTC m=+1192.308571841"
Feb 25 07:37:18 crc kubenswrapper[4749]: I0225 07:37:18.961487 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7675646668-99wj9" podStartSLOduration=2.961456448 podStartE2EDuration="2.961456448s" podCreationTimestamp="2026-02-25 07:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:18.95697099 +0000 UTC m=+1192.318797010" watchObservedRunningTime="2026-02-25 07:37:18.961456448 +0000 UTC m=+1192.323282458"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.044067 4749 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/neutron-75648c457c-q6rdr"]
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.047180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.050673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.050673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.070892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75648c457c-q6rdr"]
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.200830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgp5\" (UniqueName: \"kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.204644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.306529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgp5\" (UniqueName: \"kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.306826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.306857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.307012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.307081 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.307097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.307167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.313282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.316665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.318735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.325608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.325656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.330313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.330417 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgp5\" (UniqueName: \"kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5\") pod \"neutron-75648c457c-q6rdr\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.371716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75648c457c-q6rdr"
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.898724 4749 generic.go:334] "Generic (PLEG): container finished" podID="18514a86-67f9-4c66-b5bb-59b4a2f34627" containerID="e86715d496cbb6b72a1e08498898cae5fc9ebe0a100509ac2df6c1b23c436922" exitCode=0
Feb 25 07:37:19 crc kubenswrapper[4749]: I0225 07:37:19.898831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmwww" event={"ID":"18514a86-67f9-4c66-b5bb-59b4a2f34627","Type":"ContainerDied","Data":"e86715d496cbb6b72a1e08498898cae5fc9ebe0a100509ac2df6c1b23c436922"}
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.100124 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv26f"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.100682 4749 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-db-sync-xld25"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data\") pod \"9dd296d6-e025-45bc-9149-d6595e7e683a\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts\") pod \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngrm\" (UniqueName: \"kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm\") pod \"9dd296d6-e025-45bc-9149-d6595e7e683a\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle\") pod \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data\") pod \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle\") pod \"9dd296d6-e025-45bc-9149-d6595e7e683a\" (UID: \"9dd296d6-e025-45bc-9149-d6595e7e683a\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.252989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zmw\" (UniqueName: \"kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw\") pod \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.253009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs\") pod \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\" (UID: \"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff\") "
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.253634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs" (OuterVolumeSpecName: "logs") pod "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" (UID: "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.258356 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm" (OuterVolumeSpecName: "kube-api-access-hngrm") pod "9dd296d6-e025-45bc-9149-d6595e7e683a" (UID: "9dd296d6-e025-45bc-9149-d6595e7e683a"). InnerVolumeSpecName "kube-api-access-hngrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.259499 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9dd296d6-e025-45bc-9149-d6595e7e683a" (UID: "9dd296d6-e025-45bc-9149-d6595e7e683a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.260752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw" (OuterVolumeSpecName: "kube-api-access-x4zmw") pod "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" (UID: "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff"). InnerVolumeSpecName "kube-api-access-x4zmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.260976 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts" (OuterVolumeSpecName: "scripts") pod "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" (UID: "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.280571 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" (UID: "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.296872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dd296d6-e025-45bc-9149-d6595e7e683a" (UID: "9dd296d6-e025-45bc-9149-d6595e7e683a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.316727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data" (OuterVolumeSpecName: "config-data") pod "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" (UID: "f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356626 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356661 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356671 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356681 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zmw\" (UniqueName: \"kubernetes.io/projected/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-kube-api-access-x4zmw\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356692 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-logs\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356701 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9dd296d6-e025-45bc-9149-d6595e7e683a-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356710 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.356720 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngrm\" (UniqueName: \"kubernetes.io/projected/9dd296d6-e025-45bc-9149-d6595e7e683a-kube-api-access-hngrm\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.671278 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.671337 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.918612 4749 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-db-sync-xld25"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.918896 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xld25" event={"ID":"9dd296d6-e025-45bc-9149-d6595e7e683a","Type":"ContainerDied","Data":"d605cc4f5858ca326aa857395479a2f26cf816ddc65148f463314e2b7b80a30b"}
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.919086 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d605cc4f5858ca326aa857395479a2f26cf816ddc65148f463314e2b7b80a30b"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.921415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hv26f" event={"ID":"f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff","Type":"ContainerDied","Data":"cd39873d8c0ce54ba1a1f51ee672faf6e825c8cd4e7b4e966049dfacd39f1bd5"}
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.921444 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd39873d8c0ce54ba1a1f51ee672faf6e825c8cd4e7b4e966049dfacd39f1bd5"
Feb 25 07:37:21 crc kubenswrapper[4749]: I0225 07:37:21.921482 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hv26f"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.194578 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75c857c98-6vnbd"]
Feb 25 07:37:22 crc kubenswrapper[4749]: E0225 07:37:22.197364 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" containerName="barbican-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.197405 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" containerName="barbican-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: E0225 07:37:22.197464 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" containerName="placement-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.197472 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" containerName="placement-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.197982 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" containerName="barbican-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.198026 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" containerName="placement-db-sync"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.199574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.213350 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.213725 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-852tl"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.213836 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.213884 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.232429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75c857c98-6vnbd"]
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.233995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.271513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm7mh\" (UniqueName: \"kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.271579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.272416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.272451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.272476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.272582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.272624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.332512 4749 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/barbican-worker-5d65688c5f-x2dm8"]
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.333848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d65688c5f-x2dm8"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.339377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59plz"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.339694 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.339818 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.369266 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d65688c5f-x2dm8"]
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374072 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm7mh\" (UniqueName: \"kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.374183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.375143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.380970 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.391879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.392347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.392847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.395782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm7mh\" (UniqueName: \"kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd"
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.396791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs\") pod \"placement-75c857c98-6vnbd\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.407648 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.409168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.415156 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.419185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwdj\" (UniqueName: \"kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476702 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kxxj\" (UniqueName: \"kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" 
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.476967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.479258 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.479452 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="dnsmasq-dns" containerID="cri-o://357f1fd81a71a94296008937931668df18b3e93161f6578fb52c11e37b4ebbf6" gracePeriod=10 Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.489470 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:22 crc kubenswrapper[4749]: 
I0225 07:37:22.491184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.540518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.559199 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.559704 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.561131 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.567558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.568412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"] Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 
crc kubenswrapper[4749]: I0225 07:37:22.579418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwdj\" (UniqueName: \"kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf85s\" (UniqueName: \"kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: 
\"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kxxj\" (UniqueName: \"kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.579831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.580581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" 
(UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.590198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.591297 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.594064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.596121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.596987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" 
(UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.597179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.601733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.606446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwdj\" (UniqueName: \"kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj\") pod \"barbican-keystone-listener-55dc554858-jb4ks\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.607326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kxxj\" (UniqueName: \"kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj\") pod \"barbican-worker-5d65688c5f-x2dm8\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.664635 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x245r\" (UniqueName: \"kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: 
\"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf85s\" (UniqueName: \"kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.683917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: 
\"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.684001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.684734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.684879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.689057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.695524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" 
Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.695815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.702802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf85s\" (UniqueName: \"kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s\") pod \"dnsmasq-dns-848cf88cfc-dnmcs\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.785561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.785655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.785687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x245r\" (UniqueName: \"kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 
crc kubenswrapper[4749]: I0225 07:37:22.785711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.785729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.786095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.789377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.790557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.790639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.803499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x245r\" (UniqueName: \"kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r\") pod \"barbican-api-6b7b887d98-mfthr\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") " pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.821756 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.848197 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.932460 4749 generic.go:334] "Generic (PLEG): container finished" podID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerID="357f1fd81a71a94296008937931668df18b3e93161f6578fb52c11e37b4ebbf6" exitCode=0 Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.932511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" event={"ID":"01fcceaf-f66a-4144-af0e-d48af844d7aa","Type":"ContainerDied","Data":"357f1fd81a71a94296008937931668df18b3e93161f6578fb52c11e37b4ebbf6"} Feb 25 07:37:22 crc kubenswrapper[4749]: I0225 07:37:22.960356 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:23 crc kubenswrapper[4749]: I0225 07:37:23.944018 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 07:37:23 crc kubenswrapper[4749]: I0225 07:37:23.944059 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 07:37:23 crc kubenswrapper[4749]: I0225 07:37:23.990465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 07:37:23 crc kubenswrapper[4749]: I0225 07:37:23.999613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.231333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.231565 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.267086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.283289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.668028 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720385 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.720463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jngsx\" (UniqueName: 
\"kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx\") pod \"18514a86-67f9-4c66-b5bb-59b4a2f34627\" (UID: \"18514a86-67f9-4c66-b5bb-59b4a2f34627\") " Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.731535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.731851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts" (OuterVolumeSpecName: "scripts") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.732820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx" (OuterVolumeSpecName: "kube-api-access-jngsx") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "kube-api-access-jngsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.755173 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.773803 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:24 crc kubenswrapper[4749]: E0225 07:37:24.774168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18514a86-67f9-4c66-b5bb-59b4a2f34627" containerName="keystone-bootstrap" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.774185 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="18514a86-67f9-4c66-b5bb-59b4a2f34627" containerName="keystone-bootstrap" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.776220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data" (OuterVolumeSpecName: "config-data") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.778001 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="18514a86-67f9-4c66-b5bb-59b4a2f34627" containerName="keystone-bootstrap" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.778953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.779966 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18514a86-67f9-4c66-b5bb-59b4a2f34627" (UID: "18514a86-67f9-4c66-b5bb-59b4a2f34627"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.788514 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.788908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.789129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d28h4\" 
(UniqueName: \"kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827550 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827560 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 
07:37:24.827569 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jngsx\" (UniqueName: \"kubernetes.io/projected/18514a86-67f9-4c66-b5bb-59b4a2f34627-kube-api-access-jngsx\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827580 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827602 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.827614 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/18514a86-67f9-4c66-b5bb-59b4a2f34627-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d28h4\" (UniqueName: \"kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.929719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.934452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.935196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.935518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.935543 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.935688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.936499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs\") pod 
\"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.953050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d28h4\" (UniqueName: \"kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4\") pod \"barbican-api-7b9d595878-lkpsg\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kmwww" event={"ID":"18514a86-67f9-4c66-b5bb-59b4a2f34627","Type":"ContainerDied","Data":"ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7"} Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970102 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed27d04cb85c07e2f9db9222166bbceb011af428b64e12c83deb32fd75da3de7" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970146 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kmwww" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970568 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970587 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.970963 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.972584 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:37:24 crc kubenswrapper[4749]: I0225 07:37:24.972698 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.069054 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.089943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.090982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtbd\" (UniqueName: \"kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.134390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc\") pod \"01fcceaf-f66a-4144-af0e-d48af844d7aa\" (UID: \"01fcceaf-f66a-4144-af0e-d48af844d7aa\") " Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.144562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd" (OuterVolumeSpecName: "kube-api-access-cbtbd") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "kube-api-access-cbtbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.221347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.222379 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.237123 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtbd\" (UniqueName: \"kubernetes.io/projected/01fcceaf-f66a-4144-af0e-d48af844d7aa-kube-api-access-cbtbd\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.237149 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.306802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.331023 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config" (OuterVolumeSpecName: "config") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.335204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.338691 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.339140 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.339205 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.345137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01fcceaf-f66a-4144-af0e-d48af844d7aa" (UID: "01fcceaf-f66a-4144-af0e-d48af844d7aa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.448384 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01fcceaf-f66a-4144-af0e-d48af844d7aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.463003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75648c457c-q6rdr"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.510614 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d65688c5f-x2dm8"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.558634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.578037 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.688633 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75c857c98-6vnbd"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.698580 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:25 crc kubenswrapper[4749]: W0225 07:37:25.737296 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfd36ef0_6a4d_4333_b5b5_e5a55132dbc9.slice/crio-921b7c8b4037e6343fcb26cc28a34b3a0709f53aff6c2683b4666c50ad8e59b3 WatchSource:0}: Error finding container 921b7c8b4037e6343fcb26cc28a34b3a0709f53aff6c2683b4666c50ad8e59b3: Status 404 returned error can't find the container with id 921b7c8b4037e6343fcb26cc28a34b3a0709f53aff6c2683b4666c50ad8e59b3 Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.796140 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-567ffd99f4-495rj"] Feb 25 07:37:25 crc kubenswrapper[4749]: E0225 07:37:25.796509 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="dnsmasq-dns" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.796527 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="dnsmasq-dns" Feb 25 07:37:25 crc kubenswrapper[4749]: E0225 07:37:25.796548 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="init" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.796554 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="init" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.796773 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" containerName="dnsmasq-dns" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.797458 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.803914 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.804288 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.804439 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.804720 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.804883 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sr7tv" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.805072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.812810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-567ffd99f4-495rj"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-scripts\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-config-data\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " 
pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfjc\" (UniqueName: \"kubernetes.io/projected/d2f908fa-2e8d-44bd-ac10-d745ce196bda-kube-api-access-ztfjc\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-fernet-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-public-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-internal-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-combined-ca-bundle\") pod \"keystone-567ffd99f4-495rj\" (UID: 
\"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.859855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-credential-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.904384 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfjc\" (UniqueName: \"kubernetes.io/projected/d2f908fa-2e8d-44bd-ac10-d745ce196bda-kube-api-access-ztfjc\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-fernet-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-public-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-internal-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-combined-ca-bundle\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-credential-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-scripts\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.962712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-config-data\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.990315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-config-data\") pod \"keystone-567ffd99f4-495rj\" (UID: 
\"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.992652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-public-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.992905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-credential-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.993133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-internal-tls-certs\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.994051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-combined-ca-bundle\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.994313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfjc\" (UniqueName: \"kubernetes.io/projected/d2f908fa-2e8d-44bd-ac10-d745ce196bda-kube-api-access-ztfjc\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 
07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.994753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" event={"ID":"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9","Type":"ContainerStarted","Data":"921b7c8b4037e6343fcb26cc28a34b3a0709f53aff6c2683b4666c50ad8e59b3"} Feb 25 07:37:25 crc kubenswrapper[4749]: I0225 07:37:25.995017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-fernet-keys\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:25.997903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f908fa-2e8d-44bd-ac10-d745ce196bda-scripts\") pod \"keystone-567ffd99f4-495rj\" (UID: \"d2f908fa-2e8d-44bd-ac10-d745ce196bda\") " pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.017290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" event={"ID":"01fcceaf-f66a-4144-af0e-d48af844d7aa","Type":"ContainerDied","Data":"f76f7c399a5d44456016dbc6f5910190508c1df84749a497fd503c861687e208"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.017526 4749 scope.go:117] "RemoveContainer" containerID="357f1fd81a71a94296008937931668df18b3e93161f6578fb52c11e37b4ebbf6" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.017714 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qnn4j" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.035533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerStarted","Data":"33f99d7237107793204b10764c68749b1a7f025a9f51e118f348a6a6b9d6a140"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.035578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerStarted","Data":"93f752ee00398ad4fccd2bbe4f8d75bea0c32fd8a9b73a5316f5842bc09cf10b"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.044306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerStarted","Data":"a35980a262d1982c56f915a8809cc28292838bbe71807534edf190d035efd841"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.046726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerStarted","Data":"74787c233d1946ed9f588bff3ec1bf8b09de03fd92df8eee19e7c3357670969c"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.048394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerStarted","Data":"de438802159c139b9cbee9b998215d7c372b44ec5a745a3b113151dbae729466"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.049491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerStarted","Data":"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.051837 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerStarted","Data":"92aec7dc3972338b038094b0e9622cccb11500cde5b828d218acbf77e2e7dce5"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.055874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerStarted","Data":"7a4800d1be793d63dfacb948e140056dd9973fea07088bc95cc7f3556a769231"} Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.073224 4749 scope.go:117] "RemoveContainer" containerID="23d38970ce46055165d4f76c0bfdfa3cf725a07d8f0067141040decc707d4e18" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.078876 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"] Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.086679 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qnn4j"] Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.128849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.184379 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5578bc7b56-qlg29"] Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.185980 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.197068 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5578bc7b56-qlg29"] Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw8xr\" (UniqueName: \"kubernetes.io/projected/90ab2780-2dee-40f9-a4c6-529e08d4de0b-kube-api-access-kw8xr\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-public-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-internal-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ab2780-2dee-40f9-a4c6-529e08d4de0b-logs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-scripts\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-config-data\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.277213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-combined-ca-bundle\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw8xr\" (UniqueName: \"kubernetes.io/projected/90ab2780-2dee-40f9-a4c6-529e08d4de0b-kube-api-access-kw8xr\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-public-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-internal-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ab2780-2dee-40f9-a4c6-529e08d4de0b-logs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-scripts\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-config-data\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.380952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-combined-ca-bundle\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.386817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-scripts\") pod \"placement-5578bc7b56-qlg29\" (UID: 
\"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.392275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-public-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.393732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90ab2780-2dee-40f9-a4c6-529e08d4de0b-logs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.393874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-internal-tls-certs\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.398170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-config-data\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.402624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ab2780-2dee-40f9-a4c6-529e08d4de0b-combined-ca-bundle\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: 
I0225 07:37:26.409182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw8xr\" (UniqueName: \"kubernetes.io/projected/90ab2780-2dee-40f9-a4c6-529e08d4de0b-kube-api-access-kw8xr\") pod \"placement-5578bc7b56-qlg29\" (UID: \"90ab2780-2dee-40f9-a4c6-529e08d4de0b\") " pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.547113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:26 crc kubenswrapper[4749]: I0225 07:37:26.736312 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-567ffd99f4-495rj"] Feb 25 07:37:26 crc kubenswrapper[4749]: W0225 07:37:26.826295 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2f908fa_2e8d_44bd_ac10_d745ce196bda.slice/crio-619edff8ae2ffdd871b00c7c683ac616632f2c05c5edf4e643711ed532e4cb1d WatchSource:0}: Error finding container 619edff8ae2ffdd871b00c7c683ac616632f2c05c5edf4e643711ed532e4cb1d: Status 404 returned error can't find the container with id 619edff8ae2ffdd871b00c7c683ac616632f2c05c5edf4e643711ed532e4cb1d Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.100908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerStarted","Data":"2f375557899fed659b5c5675661801dae8b71c8aa8c3561dfc4c2bd49ae7c5db"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.102498 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75648c457c-q6rdr" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.111228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" 
event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerStarted","Data":"1d4905dc88a60a1eba304a13033b7ad68d1822dfd5b3618fd0ab4988c53e7256"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.120702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-567ffd99f4-495rj" event={"ID":"d2f908fa-2e8d-44bd-ac10-d745ce196bda","Type":"ContainerStarted","Data":"619edff8ae2ffdd871b00c7c683ac616632f2c05c5edf4e643711ed532e4cb1d"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.126333 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75648c457c-q6rdr" podStartSLOduration=9.126319526 podStartE2EDuration="9.126319526s" podCreationTimestamp="2026-02-25 07:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:27.125449955 +0000 UTC m=+1200.487275985" watchObservedRunningTime="2026-02-25 07:37:27.126319526 +0000 UTC m=+1200.488145536" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.143469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerStarted","Data":"879d94ce23cdbf332800d8b90f6d201e177283b277b1e52a332ac11dbbcd18db"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.143510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerStarted","Data":"b1287241aca3bfe61c4f30f9c25fc1e9ecd40314af97b4cf5a59bca0f342fd21"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.144451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.144479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.155769 4749 generic.go:334] "Generic (PLEG): container finished" podID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerID="d466ba914e2ddab0a640dde2609e543ddddfeb75891685a85258f22e48032f7c" exitCode=0 Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.155813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" event={"ID":"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9","Type":"ContainerDied","Data":"d466ba914e2ddab0a640dde2609e543ddddfeb75891685a85258f22e48032f7c"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.187173 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75c857c98-6vnbd" podStartSLOduration=5.187157569 podStartE2EDuration="5.187157569s" podCreationTimestamp="2026-02-25 07:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:27.161894458 +0000 UTC m=+1200.523720478" watchObservedRunningTime="2026-02-25 07:37:27.187157569 +0000 UTC m=+1200.548983589" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.202997 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.203178 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.203811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerStarted","Data":"e9529d18ff1cbb26b2c62968fee04177fa02f90ce8a9cd078bcc8b34708a27df"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.203851 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:27 crc kubenswrapper[4749]: 
I0225 07:37:27.203861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerStarted","Data":"4e21bb1c1fb03c8a00aa939aba286c9eba6ab8bb389ff152fce5d5325e12b729"} Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.203871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.254999 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b7b887d98-mfthr" podStartSLOduration=5.254978331 podStartE2EDuration="5.254978331s" podCreationTimestamp="2026-02-25 07:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:27.231464953 +0000 UTC m=+1200.593290983" watchObservedRunningTime="2026-02-25 07:37:27.254978331 +0000 UTC m=+1200.616804351" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.368780 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01fcceaf-f66a-4144-af0e-d48af844d7aa" path="/var/lib/kubelet/pods/01fcceaf-f66a-4144-af0e-d48af844d7aa/volumes" Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.501375 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5578bc7b56-qlg29"] Feb 25 07:37:27 crc kubenswrapper[4749]: I0225 07:37:27.867218 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.218784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5578bc7b56-qlg29" event={"ID":"90ab2780-2dee-40f9-a4c6-529e08d4de0b","Type":"ContainerStarted","Data":"2454524e4ab08dc5df86b4694e1e8211d4eba5ec96fc1c669f134788263ad8b6"} Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.219148 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5578bc7b56-qlg29" event={"ID":"90ab2780-2dee-40f9-a4c6-529e08d4de0b","Type":"ContainerStarted","Data":"237459146ef2421c9599a2f89f81ff80430b8c824ce535118c87526739aaed4b"} Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.223318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerStarted","Data":"bfc0f2b581ec8602ef9c54e2dbd8b61eba0056db2f7dfa3db9deb0b77b268856"} Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.223411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.223456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.225995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-567ffd99f4-495rj" event={"ID":"d2f908fa-2e8d-44bd-ac10-d745ce196bda","Type":"ContainerStarted","Data":"f670d20b7476126d2986a2ae06e066b0ad75352e52d44e15fb8e019fda338453"} Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.226452 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.230005 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.230028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" event={"ID":"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9","Type":"ContainerStarted","Data":"a13f7e155e5dbbbb3a6baf9f9287d27c5a97d62ea8f896020b236ccd81689d43"} Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.245533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.266400 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b9d595878-lkpsg" podStartSLOduration=4.266383215 podStartE2EDuration="4.266383215s" podCreationTimestamp="2026-02-25 07:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:28.245650623 +0000 UTC m=+1201.607476653" watchObservedRunningTime="2026-02-25 07:37:28.266383215 +0000 UTC m=+1201.628209235" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.271900 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-567ffd99f4-495rj" podStartSLOduration=3.271888099 podStartE2EDuration="3.271888099s" podCreationTimestamp="2026-02-25 07:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:28.259083388 +0000 UTC m=+1201.620909408" watchObservedRunningTime="2026-02-25 07:37:28.271888099 +0000 UTC m=+1201.633714119" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.318054 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" podStartSLOduration=6.318035546 podStartE2EDuration="6.318035546s" podCreationTimestamp="2026-02-25 07:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:28.301434444 +0000 UTC m=+1201.663260464" watchObservedRunningTime="2026-02-25 07:37:28.318035546 +0000 UTC m=+1201.679861566" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.525642 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.525764 
4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 07:37:28 crc kubenswrapper[4749]: I0225 07:37:28.532482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.242032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5578bc7b56-qlg29" event={"ID":"90ab2780-2dee-40f9-a4c6-529e08d4de0b","Type":"ContainerStarted","Data":"708d068eaa22a2c6947819c54f8c13d2f945f90238ec28f7e60065e38f4135b8"} Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.242993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.243020 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.246491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lth6k" event={"ID":"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b","Type":"ContainerStarted","Data":"78b10e22e85da74a02f9a9893d84ab73c5ad25cbc3a9b93544da99c360822b50"} Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.247976 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:29 crc kubenswrapper[4749]: I0225 07:37:29.274385 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5578bc7b56-qlg29" podStartSLOduration=3.274297604 podStartE2EDuration="3.274297604s" podCreationTimestamp="2026-02-25 07:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:29.263190195 +0000 UTC m=+1202.625016215" watchObservedRunningTime="2026-02-25 07:37:29.274297604 +0000 UTC m=+1202.636123624" Feb 25 07:37:29 crc 
kubenswrapper[4749]: I0225 07:37:29.299490 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lth6k" podStartSLOduration=4.018404158 podStartE2EDuration="44.299471804s" podCreationTimestamp="2026-02-25 07:36:45 +0000 UTC" firstStartedPulling="2026-02-25 07:36:46.793195619 +0000 UTC m=+1160.155021639" lastFinishedPulling="2026-02-25 07:37:27.074263265 +0000 UTC m=+1200.436089285" observedRunningTime="2026-02-25 07:37:29.281684652 +0000 UTC m=+1202.643510672" watchObservedRunningTime="2026-02-25 07:37:29.299471804 +0000 UTC m=+1202.661297814" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.264845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerStarted","Data":"f40e517abcc0417f27c38603ff055a7c972739ef563cd6d314577c40bcdc8791"} Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.265343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerStarted","Data":"f203a4441746b52475d6b5e7df6defbd0457f2d1d29bb70b989419be807331a8"} Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.267940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerStarted","Data":"c0d0932dabf3d20b47de7bb7da8b0eea1bb1212e0074b8955e36be08c27c46c5"} Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.267977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerStarted","Data":"d11ae6b88bb5d65812ef2f531fb4200f790749c011e0ebf3f39d1f7d104adb9b"} Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.293147 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" podStartSLOduration=4.812771702 podStartE2EDuration="9.293132413s" podCreationTimestamp="2026-02-25 07:37:22 +0000 UTC" firstStartedPulling="2026-02-25 07:37:25.550584136 +0000 UTC m=+1198.912410156" lastFinishedPulling="2026-02-25 07:37:30.030944857 +0000 UTC m=+1203.392770867" observedRunningTime="2026-02-25 07:37:31.287116588 +0000 UTC m=+1204.648942608" watchObservedRunningTime="2026-02-25 07:37:31.293132413 +0000 UTC m=+1204.654958433" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.321002 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d65688c5f-x2dm8" podStartSLOduration=4.814493794 podStartE2EDuration="9.320980208s" podCreationTimestamp="2026-02-25 07:37:22 +0000 UTC" firstStartedPulling="2026-02-25 07:37:25.518777266 +0000 UTC m=+1198.880603286" lastFinishedPulling="2026-02-25 07:37:30.02526367 +0000 UTC m=+1203.387089700" observedRunningTime="2026-02-25 07:37:31.314818719 +0000 UTC m=+1204.676644739" watchObservedRunningTime="2026-02-25 07:37:31.320980208 +0000 UTC m=+1204.682806228" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.455570 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5d5c5c5846-h56g5"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.457013 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.516754 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d5c5c5846-h56g5"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.533942 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-df9d99775-z76dt"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.536009 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.544110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-df9d99775-z76dt"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.600719 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.600936 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log" containerID="cri-o://4e21bb1c1fb03c8a00aa939aba286c9eba6ab8bb389ff152fce5d5325e12b729" gracePeriod=30 Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.601634 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api" containerID="cri-o://e9529d18ff1cbb26b2c62968fee04177fa02f90ce8a9cd078bcc8b34708a27df" gracePeriod=30 Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.627733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-combined-ca-bundle\") pod 
\"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.627783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data-custom\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.627822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004d8426-4842-4f64-ba76-6ee6afed85de-logs\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.627846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7b9\" (UniqueName: \"kubernetes.io/projected/004d8426-4842-4f64-ba76-6ee6afed85de-kube-api-access-ms7b9\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.627865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.651024 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-api-6b57cbd48-btws7"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.652545 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.672608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b57cbd48-btws7"] Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data-custom\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9111c168-e7cf-494e-a603-93b1f9db0b73-logs\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-combined-ca-bundle\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data-custom\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: 
\"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004d8426-4842-4f64-ba76-6ee6afed85de-logs\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7b9\" (UniqueName: \"kubernetes.io/projected/004d8426-4842-4f64-ba76-6ee6afed85de-kube-api-access-ms7b9\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld9h5\" (UniqueName: \"kubernetes.io/projected/9111c168-e7cf-494e-a603-93b1f9db0b73-kube-api-access-ld9h5\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.729797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-combined-ca-bundle\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.731251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/004d8426-4842-4f64-ba76-6ee6afed85de-logs\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.736985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data-custom\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.745229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-config-data\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.745733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/004d8426-4842-4f64-ba76-6ee6afed85de-combined-ca-bundle\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.746736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7b9\" (UniqueName: \"kubernetes.io/projected/004d8426-4842-4f64-ba76-6ee6afed85de-kube-api-access-ms7b9\") pod \"barbican-keystone-listener-5d5c5c5846-h56g5\" (UID: \"004d8426-4842-4f64-ba76-6ee6afed85de\") " pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-public-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data-custom\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9111c168-e7cf-494e-a603-93b1f9db0b73-logs\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831606 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-combined-ca-bundle\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-logs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-internal-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld9h5\" (UniqueName: \"kubernetes.io/projected/9111c168-e7cf-494e-a603-93b1f9db0b73-kube-api-access-ld9h5\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.832000 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.832028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-combined-ca-bundle\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.832095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data-custom\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.832136 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttknm\" (UniqueName: \"kubernetes.io/projected/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-kube-api-access-ttknm\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.831831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9111c168-e7cf-494e-a603-93b1f9db0b73-logs\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.836827 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data-custom\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.837426 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-config-data\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.837996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9111c168-e7cf-494e-a603-93b1f9db0b73-combined-ca-bundle\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.843897 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.851974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld9h5\" (UniqueName: \"kubernetes.io/projected/9111c168-e7cf-494e-a603-93b1f9db0b73-kube-api-access-ld9h5\") pod \"barbican-worker-df9d99775-z76dt\" (UID: \"9111c168-e7cf-494e-a603-93b1f9db0b73\") " pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.861533 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-df9d99775-z76dt" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-public-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-combined-ca-bundle\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-logs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-internal-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" 
Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data-custom\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.934389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttknm\" (UniqueName: \"kubernetes.io/projected/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-kube-api-access-ttknm\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.935038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-logs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.937458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-combined-ca-bundle\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.939144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.939967 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-public-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.940281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-internal-tls-certs\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.942957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-config-data-custom\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.958220 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttknm\" (UniqueName: \"kubernetes.io/projected/35a9ddc1-5f7b-4a22-8b8f-45b895c6731c-kube-api-access-ttknm\") pod \"barbican-api-6b57cbd48-btws7\" (UID: \"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c\") " pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:31 crc kubenswrapper[4749]: I0225 07:37:31.988312 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:32 crc kubenswrapper[4749]: I0225 07:37:32.310060 4749 generic.go:334] "Generic (PLEG): container finished" podID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerID="4e21bb1c1fb03c8a00aa939aba286c9eba6ab8bb389ff152fce5d5325e12b729" exitCode=143 Feb 25 07:37:32 crc kubenswrapper[4749]: I0225 07:37:32.310782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerDied","Data":"4e21bb1c1fb03c8a00aa939aba286c9eba6ab8bb389ff152fce5d5325e12b729"} Feb 25 07:37:32 crc kubenswrapper[4749]: I0225 07:37:32.849893 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:32 crc kubenswrapper[4749]: I0225 07:37:32.947216 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"] Feb 25 07:37:32 crc kubenswrapper[4749]: I0225 07:37:32.947487 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="dnsmasq-dns" containerID="cri-o://9dd488be4eba59a1bbce345ddb637a15abf53905b04b932957e4e6a123f13e64" gracePeriod=10 Feb 25 07:37:33 crc kubenswrapper[4749]: I0225 07:37:33.121333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:33 crc kubenswrapper[4749]: I0225 07:37:33.225138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b7b887d98-mfthr" Feb 25 07:37:33 crc kubenswrapper[4749]: I0225 07:37:33.332605 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerID="9dd488be4eba59a1bbce345ddb637a15abf53905b04b932957e4e6a123f13e64" exitCode=0 Feb 25 07:37:33 crc 
kubenswrapper[4749]: I0225 07:37:33.346807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" event={"ID":"fbb3099c-45d7-45ed-ae22-724581a55bf9","Type":"ContainerDied","Data":"9dd488be4eba59a1bbce345ddb637a15abf53905b04b932957e4e6a123f13e64"}
Feb 25 07:37:34 crc kubenswrapper[4749]: I0225 07:37:34.345625 4749 generic.go:334] "Generic (PLEG): container finished" podID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" containerID="78b10e22e85da74a02f9a9893d84ab73c5ad25cbc3a9b93544da99c360822b50" exitCode=0
Feb 25 07:37:34 crc kubenswrapper[4749]: I0225 07:37:34.345842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lth6k" event={"ID":"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b","Type":"ContainerDied","Data":"78b10e22e85da74a02f9a9893d84ab73c5ad25cbc3a9b93544da99c360822b50"}
Feb 25 07:37:34 crc kubenswrapper[4749]: I0225 07:37:34.974366 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Feb 25 07:37:35 crc kubenswrapper[4749]: I0225 07:37:35.089894 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75597b5c88-58jkm" podUID="43b78d96-31fe-4729-aacf-09c66c121861" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused"
Feb 25 07:37:36 crc kubenswrapper[4749]: I0225 07:37:36.564216 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused"
Feb 25 07:37:36 crc kubenswrapper[4749]: I0225 07:37:36.579911 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b9d595878-lkpsg"
Feb 25 07:37:36 crc kubenswrapper[4749]: I0225 07:37:36.687733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b9d595878-lkpsg"
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.002036 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46110->10.217.0.168:9311: read: connection reset by peer"
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.002174 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:46122->10.217.0.168:9311: read: connection reset by peer"
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.386746 4749 generic.go:334] "Generic (PLEG): container finished" podID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerID="e9529d18ff1cbb26b2c62968fee04177fa02f90ce8a9cd078bcc8b34708a27df" exitCode=0
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.387010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerDied","Data":"e9529d18ff1cbb26b2c62968fee04177fa02f90ce8a9cd078bcc8b34708a27df"}
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.602131 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lth6k"
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750863 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwbhb\" (UniqueName: \"kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.750985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data\") pod \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\" (UID: \"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b\") "
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.751706 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.770978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb" (OuterVolumeSpecName: "kube-api-access-lwbhb") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "kube-api-access-lwbhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.782299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts" (OuterVolumeSpecName: "scripts") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.785606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.786715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.806916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data" (OuterVolumeSpecName: "config-data") pod "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" (UID: "8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853156 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853179 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853188 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwbhb\" (UniqueName: \"kubernetes.io/projected/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-kube-api-access-lwbhb\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853198 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853206 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:37 crc kubenswrapper[4749]: I0225 07:37:37.853218 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.188348 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.198149 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b887d98-mfthr"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.361949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.361991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362050 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x245r\" (UniqueName: \"kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r\") pod \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362196 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data\") pod \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom\") pod \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle\") pod \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs\") pod \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\" (UID: \"29bd41cc-03ed-4df9-9862-c52b57a4f14e\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8j4j\" (UniqueName: \"kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.362838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb\") pod \"fbb3099c-45d7-45ed-ae22-724581a55bf9\" (UID: \"fbb3099c-45d7-45ed-ae22-724581a55bf9\") "
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.367334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs" (OuterVolumeSpecName: "logs") pod "29bd41cc-03ed-4df9-9862-c52b57a4f14e" (UID: "29bd41cc-03ed-4df9-9862-c52b57a4f14e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.367432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r" (OuterVolumeSpecName: "kube-api-access-x245r") pod "29bd41cc-03ed-4df9-9862-c52b57a4f14e" (UID: "29bd41cc-03ed-4df9-9862-c52b57a4f14e"). InnerVolumeSpecName "kube-api-access-x245r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.370065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j" (OuterVolumeSpecName: "kube-api-access-q8j4j") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "kube-api-access-q8j4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.373870 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29bd41cc-03ed-4df9-9862-c52b57a4f14e" (UID: "29bd41cc-03ed-4df9-9862-c52b57a4f14e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.403068 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lth6k"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.403516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lth6k" event={"ID":"8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b","Type":"ContainerDied","Data":"0c191011eb4a7c76c4e5a5f43d6245630c8d092d88ff5cee39fbac7f175ff610"}
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.403656 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c191011eb4a7c76c4e5a5f43d6245630c8d092d88ff5cee39fbac7f175ff610"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.405007 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29bd41cc-03ed-4df9-9862-c52b57a4f14e" (UID: "29bd41cc-03ed-4df9-9862-c52b57a4f14e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.406423 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b7b887d98-mfthr" event={"ID":"29bd41cc-03ed-4df9-9862-c52b57a4f14e","Type":"ContainerDied","Data":"de438802159c139b9cbee9b998215d7c372b44ec5a745a3b113151dbae729466"}
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.406462 4749 scope.go:117] "RemoveContainer" containerID="e9529d18ff1cbb26b2c62968fee04177fa02f90ce8a9cd078bcc8b34708a27df"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.406542 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b7b887d98-mfthr"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.415997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f" event={"ID":"fbb3099c-45d7-45ed-ae22-724581a55bf9","Type":"ContainerDied","Data":"f49a8c993b4f4fa846e1412e639cbf0021e2a87c8a6bd32315d576afdc7704b1"}
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.416475 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-pf87f"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.430616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data" (OuterVolumeSpecName: "config-data") pod "29bd41cc-03ed-4df9-9862-c52b57a4f14e" (UID: "29bd41cc-03ed-4df9-9862-c52b57a4f14e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.438500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.438818 4749 scope.go:117] "RemoveContainer" containerID="4e21bb1c1fb03c8a00aa939aba286c9eba6ab8bb389ff152fce5d5325e12b729"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.446291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config" (OuterVolumeSpecName: "config") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.448020 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.452294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464421 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-config\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464444 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464454 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464462 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bd41cc-03ed-4df9-9862-c52b57a4f14e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464470 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bd41cc-03ed-4df9-9862-c52b57a4f14e-logs\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464480 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8j4j\" (UniqueName: \"kubernetes.io/projected/fbb3099c-45d7-45ed-ae22-724581a55bf9-kube-api-access-q8j4j\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464489 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464497 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464504 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.464512 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x245r\" (UniqueName: \"kubernetes.io/projected/29bd41cc-03ed-4df9-9862-c52b57a4f14e-kube-api-access-x245r\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.472763 4749 scope.go:117] "RemoveContainer" containerID="9dd488be4eba59a1bbce345ddb637a15abf53905b04b932957e4e6a123f13e64"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.481615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fbb3099c-45d7-45ed-ae22-724581a55bf9" (UID: "fbb3099c-45d7-45ed-ae22-724581a55bf9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.496940 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.566290 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fbb3099c-45d7-45ed-ae22-724581a55bf9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.592923 4749 scope.go:117] "RemoveContainer" containerID="69b07e39d33da43a08c383b427d2a1adcbfc5a5ed8fde4cb8de87bfcd29d0337"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.599068 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-df9d99775-z76dt"]
Feb 25 07:37:38 crc kubenswrapper[4749]: W0225 07:37:38.602529 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9111c168_e7cf_494e_a603_93b1f9db0b73.slice/crio-ec16a300d7cad6ec06cd44b0695e1ec4bfdff088328c78793d9596b4fc5ea430 WatchSource:0}: Error finding container ec16a300d7cad6ec06cd44b0695e1ec4bfdff088328c78793d9596b4fc5ea430: Status 404 returned error can't find the container with id ec16a300d7cad6ec06cd44b0695e1ec4bfdff088328c78793d9596b4fc5ea430
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.740179 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b57cbd48-btws7"]
Feb 25 07:37:38 crc kubenswrapper[4749]: W0225 07:37:38.763727 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a9ddc1_5f7b_4a22_8b8f_45b895c6731c.slice/crio-56be0d8c3c7908d7e144601ee5ff9021f955d9ee7b7167db6e37fe853fe596f0 WatchSource:0}: Error finding container 56be0d8c3c7908d7e144601ee5ff9021f955d9ee7b7167db6e37fe853fe596f0: Status 404 returned error can't find the container with id 56be0d8c3c7908d7e144601ee5ff9021f955d9ee7b7167db6e37fe853fe596f0
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.844678 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.890155 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6b7b887d98-mfthr"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.936288 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5d5c5c5846-h56g5"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.948428 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.963364 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-pf87f"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972121 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.972550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972568 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api"
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.972578 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972585 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log"
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.972619 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" containerName="cinder-db-sync"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972627 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" containerName="cinder-db-sync"
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.972645 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="dnsmasq-dns"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972651 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="dnsmasq-dns"
Feb 25 07:37:38 crc kubenswrapper[4749]: E0225 07:37:38.972661 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="init"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="init"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972848 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972868 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972878 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" containerName="cinder-db-sync"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.972888 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" containerName="dnsmasq-dns"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.973973 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.977839 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.977908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.977985 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.978050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.978207 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-lpz5m"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.985262 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"]
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.986682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.988926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.988968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.988995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989015 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989053 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5vv\" (UniqueName: \"kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989203 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989246 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:38 crc kubenswrapper[4749]: I0225 07:37:38.989300 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5fh\" (UniqueName: \"kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:38.999784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"]
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.090556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.090659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.090706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5fh\" (UniqueName: \"kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.090740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.090778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091275 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb5vv\" (UniqueName: \"kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0"
Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.091877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml"
Feb 25 07:37:39 
crc kubenswrapper[4749]: I0225 07:37:39.091958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.092439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.092506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.093326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.093663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.099619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.101194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.103163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.105940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.117066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5fh\" (UniqueName: \"kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh\") pod \"cinder-scheduler-0\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.118635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5vv\" (UniqueName: \"kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv\") pod \"dnsmasq-dns-6578955fd5-9wvml\" (UID: 
\"32cd89a4-78d4-4298-90b6-d854e9a35178\") " pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.171221 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.173428 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.175890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.195783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.195848 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.195917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.195944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.195991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.196087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.196139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qw5\" (UniqueName: \"kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.211445 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qw5\" (UniqueName: \"kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.298856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.302409 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.307489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.307496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.307965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.315179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data\") pod 
\"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.315262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qw5\" (UniqueName: \"kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5\") pod \"cinder-api-0\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.327774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.338824 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" path="/var/lib/kubelet/pods/29bd41cc-03ed-4df9-9862-c52b57a4f14e/volumes" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.340928 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb3099c-45d7-45ed-ae22-724581a55bf9" path="/var/lib/kubelet/pods/fbb3099c-45d7-45ed-ae22-724581a55bf9/volumes" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.454520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-df9d99775-z76dt" event={"ID":"9111c168-e7cf-494e-a603-93b1f9db0b73","Type":"ContainerStarted","Data":"61858ef78ef4471a2b9892a4cd9e7ead913645ee2eae6ad16961be5c270923e8"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.454845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-df9d99775-z76dt" event={"ID":"9111c168-e7cf-494e-a603-93b1f9db0b73","Type":"ContainerStarted","Data":"a17cbfed4d645e5309c953f5b633b962aee1c2f8d4932a152d19b476c0cdaea6"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.454856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-df9d99775-z76dt" 
event={"ID":"9111c168-e7cf-494e-a603-93b1f9db0b73","Type":"ContainerStarted","Data":"ec16a300d7cad6ec06cd44b0695e1ec4bfdff088328c78793d9596b4fc5ea430"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.464738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerStarted","Data":"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.464893 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="ceilometer-notification-agent" containerID="cri-o://d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.465139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.465184 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="proxy-httpd" containerID="cri-o://d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.465222 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="sg-core" containerID="cri-o://9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.473908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b57cbd48-btws7" event={"ID":"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c","Type":"ContainerStarted","Data":"5a0220be2f275b2c47c20420b14dd732dbe076e4afa76ddcf0d6c27d85f42ceb"} Feb 25 07:37:39 crc 
kubenswrapper[4749]: I0225 07:37:39.473946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b57cbd48-btws7" event={"ID":"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c","Type":"ContainerStarted","Data":"96802a111262ff56441aca742f2a29386ef918467388c2ca031c45bde989bb4e"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.473956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b57cbd48-btws7" event={"ID":"35a9ddc1-5f7b-4a22-8b8f-45b895c6731c","Type":"ContainerStarted","Data":"56be0d8c3c7908d7e144601ee5ff9021f955d9ee7b7167db6e37fe853fe596f0"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.475073 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.475177 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.488395 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-df9d99775-z76dt" podStartSLOduration=8.488370858 podStartE2EDuration="8.488370858s" podCreationTimestamp="2026-02-25 07:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:39.479332528 +0000 UTC m=+1212.841158538" watchObservedRunningTime="2026-02-25 07:37:39.488370858 +0000 UTC m=+1212.850196878" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.491811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" event={"ID":"004d8426-4842-4f64-ba76-6ee6afed85de","Type":"ContainerStarted","Data":"0d06fc289c744d1066e68ca1027826d041010da3c8cdb9322d3654ca92c13b1b"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.491853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" event={"ID":"004d8426-4842-4f64-ba76-6ee6afed85de","Type":"ContainerStarted","Data":"ca10405a9ba2a45d0d8e1e6faa1037e523b1cd6d9befb5030371eeef5ebc1899"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.491869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" event={"ID":"004d8426-4842-4f64-ba76-6ee6afed85de","Type":"ContainerStarted","Data":"e4c0835a609360b24384533621dd2ab06a6a8e4c318e17f93d2a807f71ebf779"} Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.516076 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.526377 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5d65688c5f-x2dm8"] Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.526625 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5d65688c5f-x2dm8" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker-log" containerID="cri-o://d11ae6b88bb5d65812ef2f531fb4200f790749c011e0ebf3f39d1f7d104adb9b" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.526735 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5d65688c5f-x2dm8" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker" containerID="cri-o://c0d0932dabf3d20b47de7bb7da8b0eea1bb1212e0074b8955e36be08c27c46c5" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.530232 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b57cbd48-btws7" podStartSLOduration=8.53021523 podStartE2EDuration="8.53021523s" podCreationTimestamp="2026-02-25 07:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:39.504110139 +0000 UTC m=+1212.865936179" watchObservedRunningTime="2026-02-25 07:37:39.53021523 +0000 UTC m=+1212.892041250" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.611411 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5d5c5c5846-h56g5" podStartSLOduration=8.611393016 podStartE2EDuration="8.611393016s" podCreationTimestamp="2026-02-25 07:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:39.582256651 +0000 UTC m=+1212.944082671" watchObservedRunningTime="2026-02-25 07:37:39.611393016 +0000 UTC m=+1212.973219036" Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.627861 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.628142 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener-log" containerID="cri-o://f203a4441746b52475d6b5e7df6defbd0457f2d1d29bb70b989419be807331a8" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.628552 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener" containerID="cri-o://f40e517abcc0417f27c38603ff055a7c972739ef563cd6d314577c40bcdc8791" gracePeriod=30 Feb 25 07:37:39 crc kubenswrapper[4749]: I0225 07:37:39.861529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:39 crc kubenswrapper[4749]: W0225 07:37:39.878735 4749 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72364f50_56ba_4cd1_8672_053c7e2103a9.slice/crio-900d7d5738fb04ae9c6848d11fe324b6e0c28c23d6301e13414393befcc19abf WatchSource:0}: Error finding container 900d7d5738fb04ae9c6848d11fe324b6e0c28c23d6301e13414393befcc19abf: Status 404 returned error can't find the container with id 900d7d5738fb04ae9c6848d11fe324b6e0c28c23d6301e13414393befcc19abf Feb 25 07:37:40 crc kubenswrapper[4749]: W0225 07:37:40.000448 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32cd89a4_78d4_4298_90b6_d854e9a35178.slice/crio-3e69e8b6780422d97d026bb5587bcfb8008557952477dd4bf63e1ed3384e3338 WatchSource:0}: Error finding container 3e69e8b6780422d97d026bb5587bcfb8008557952477dd4bf63e1ed3384e3338: Status 404 returned error can't find the container with id 3e69e8b6780422d97d026bb5587bcfb8008557952477dd4bf63e1ed3384e3338 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.014541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"] Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.096469 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.526677 4749 generic.go:334] "Generic (PLEG): container finished" podID="42b0e946-6c95-4168-99a0-74d26d717583" containerID="c0d0932dabf3d20b47de7bb7da8b0eea1bb1212e0074b8955e36be08c27c46c5" exitCode=0 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.526848 4749 generic.go:334] "Generic (PLEG): container finished" podID="42b0e946-6c95-4168-99a0-74d26d717583" containerID="d11ae6b88bb5d65812ef2f531fb4200f790749c011e0ebf3f39d1f7d104adb9b" exitCode=143 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.526767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" 
event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerDied","Data":"c0d0932dabf3d20b47de7bb7da8b0eea1bb1212e0074b8955e36be08c27c46c5"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.526918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerDied","Data":"d11ae6b88bb5d65812ef2f531fb4200f790749c011e0ebf3f39d1f7d104adb9b"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.531735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerStarted","Data":"900d7d5738fb04ae9c6848d11fe324b6e0c28c23d6301e13414393befcc19abf"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.534164 4749 generic.go:334] "Generic (PLEG): container finished" podID="844c852a-ae6d-47e2-804a-ecce73c85522" containerID="f40e517abcc0417f27c38603ff055a7c972739ef563cd6d314577c40bcdc8791" exitCode=0 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.534188 4749 generic.go:334] "Generic (PLEG): container finished" podID="844c852a-ae6d-47e2-804a-ecce73c85522" containerID="f203a4441746b52475d6b5e7df6defbd0457f2d1d29bb70b989419be807331a8" exitCode=143 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.534227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerDied","Data":"f40e517abcc0417f27c38603ff055a7c972739ef563cd6d314577c40bcdc8791"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.534245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerDied","Data":"f203a4441746b52475d6b5e7df6defbd0457f2d1d29bb70b989419be807331a8"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 
07:37:40.556565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerStarted","Data":"6c60fd45c8d2b3d8ed6465581853bc6238f81d50df17ad8ad8a280ea5ea8c906"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.564202 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerID="d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8" exitCode=0 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.564237 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerID="9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1" exitCode=2 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.564278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerDied","Data":"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.564305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerDied","Data":"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.565762 4749 generic.go:334] "Generic (PLEG): container finished" podID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerID="bedceda6e587ee5b9a0de07ac8ba3359846cbeefd382ad880bf66c7b360276ce" exitCode=0 Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.566954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" event={"ID":"32cd89a4-78d4-4298-90b6-d854e9a35178","Type":"ContainerDied","Data":"bedceda6e587ee5b9a0de07ac8ba3359846cbeefd382ad880bf66c7b360276ce"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.566979 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" event={"ID":"32cd89a4-78d4-4298-90b6-d854e9a35178","Type":"ContainerStarted","Data":"3e69e8b6780422d97d026bb5587bcfb8008557952477dd4bf63e1ed3384e3338"} Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.882129 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:40 crc kubenswrapper[4749]: I0225 07:37:40.907661 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.035677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom\") pod \"42b0e946-6c95-4168-99a0-74d26d717583\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036258 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data\") pod \"42b0e946-6c95-4168-99a0-74d26d717583\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036329 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle\") pod \"42b0e946-6c95-4168-99a0-74d26d717583\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kxxj\" (UniqueName: \"kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj\") pod 
\"42b0e946-6c95-4168-99a0-74d26d717583\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmwdj\" (UniqueName: \"kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj\") pod \"844c852a-ae6d-47e2-804a-ecce73c85522\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data\") pod \"844c852a-ae6d-47e2-804a-ecce73c85522\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs\") pod \"844c852a-ae6d-47e2-804a-ecce73c85522\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs\") pod \"42b0e946-6c95-4168-99a0-74d26d717583\" (UID: \"42b0e946-6c95-4168-99a0-74d26d717583\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle\") pod \"844c852a-ae6d-47e2-804a-ecce73c85522\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.036741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom\") pod \"844c852a-ae6d-47e2-804a-ecce73c85522\" (UID: \"844c852a-ae6d-47e2-804a-ecce73c85522\") " Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.039301 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs" (OuterVolumeSpecName: "logs") pod "42b0e946-6c95-4168-99a0-74d26d717583" (UID: "42b0e946-6c95-4168-99a0-74d26d717583"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.039746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs" (OuterVolumeSpecName: "logs") pod "844c852a-ae6d-47e2-804a-ecce73c85522" (UID: "844c852a-ae6d-47e2-804a-ecce73c85522"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.042062 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj" (OuterVolumeSpecName: "kube-api-access-bmwdj") pod "844c852a-ae6d-47e2-804a-ecce73c85522" (UID: "844c852a-ae6d-47e2-804a-ecce73c85522"). InnerVolumeSpecName "kube-api-access-bmwdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.042685 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42b0e946-6c95-4168-99a0-74d26d717583" (UID: "42b0e946-6c95-4168-99a0-74d26d717583"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.049522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "844c852a-ae6d-47e2-804a-ecce73c85522" (UID: "844c852a-ae6d-47e2-804a-ecce73c85522"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.051215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj" (OuterVolumeSpecName: "kube-api-access-2kxxj") pod "42b0e946-6c95-4168-99a0-74d26d717583" (UID: "42b0e946-6c95-4168-99a0-74d26d717583"). InnerVolumeSpecName "kube-api-access-2kxxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.081747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42b0e946-6c95-4168-99a0-74d26d717583" (UID: "42b0e946-6c95-4168-99a0-74d26d717583"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.087760 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "844c852a-ae6d-47e2-804a-ecce73c85522" (UID: "844c852a-ae6d-47e2-804a-ecce73c85522"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.099505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data" (OuterVolumeSpecName: "config-data") pod "42b0e946-6c95-4168-99a0-74d26d717583" (UID: "42b0e946-6c95-4168-99a0-74d26d717583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.118243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data" (OuterVolumeSpecName: "config-data") pod "844c852a-ae6d-47e2-804a-ecce73c85522" (UID: "844c852a-ae6d-47e2-804a-ecce73c85522"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138736 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138772 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kxxj\" (UniqueName: \"kubernetes.io/projected/42b0e946-6c95-4168-99a0-74d26d717583-kube-api-access-2kxxj\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138784 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmwdj\" (UniqueName: \"kubernetes.io/projected/844c852a-ae6d-47e2-804a-ecce73c85522-kube-api-access-bmwdj\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138793 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data\") on node \"crc\" DevicePath 
\"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138801 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/844c852a-ae6d-47e2-804a-ecce73c85522-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138810 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42b0e946-6c95-4168-99a0-74d26d717583-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138818 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138826 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/844c852a-ae6d-47e2-804a-ecce73c85522-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138834 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.138861 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b0e946-6c95-4168-99a0-74d26d717583-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.252015 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.579536 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.579545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-55dc554858-jb4ks" event={"ID":"844c852a-ae6d-47e2-804a-ecce73c85522","Type":"ContainerDied","Data":"92aec7dc3972338b038094b0e9622cccb11500cde5b828d218acbf77e2e7dce5"} Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.579968 4749 scope.go:117] "RemoveContainer" containerID="f40e517abcc0417f27c38603ff055a7c972739ef563cd6d314577c40bcdc8791" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.584168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerStarted","Data":"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8"} Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.585468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d65688c5f-x2dm8" event={"ID":"42b0e946-6c95-4168-99a0-74d26d717583","Type":"ContainerDied","Data":"7a4800d1be793d63dfacb948e140056dd9973fea07088bc95cc7f3556a769231"} Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.585506 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d65688c5f-x2dm8" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.587604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerStarted","Data":"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca"} Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.595705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" event={"ID":"32cd89a4-78d4-4298-90b6-d854e9a35178","Type":"ContainerStarted","Data":"b3a693c2d99e58049e6ad8692fd7d56e9bf721d8ffe37eb100f0f8d1de232899"} Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.596065 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.603627 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.615807 4749 scope.go:117] "RemoveContainer" containerID="f203a4441746b52475d6b5e7df6defbd0457f2d1d29bb70b989419be807331a8" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.618003 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-55dc554858-jb4ks"] Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.632154 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5d65688c5f-x2dm8"] Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.640405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5d65688c5f-x2dm8"] Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.644212 4749 scope.go:117] "RemoveContainer" containerID="c0d0932dabf3d20b47de7bb7da8b0eea1bb1212e0074b8955e36be08c27c46c5" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.646931 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" podStartSLOduration=3.64692148 podStartE2EDuration="3.64692148s" podCreationTimestamp="2026-02-25 07:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:41.626280711 +0000 UTC m=+1214.988106731" watchObservedRunningTime="2026-02-25 07:37:41.64692148 +0000 UTC m=+1215.008747500" Feb 25 07:37:41 crc kubenswrapper[4749]: I0225 07:37:41.663741 4749 scope.go:117] "RemoveContainer" containerID="d11ae6b88bb5d65812ef2f531fb4200f790749c011e0ebf3f39d1f7d104adb9b" Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.617778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerStarted","Data":"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a"} Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.618305 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api-log" containerID="cri-o://b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" gracePeriod=30 Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.618465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.618483 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api" containerID="cri-o://61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" gracePeriod=30 Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.627629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerStarted","Data":"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97"} Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.666053 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.66602301 podStartE2EDuration="3.66602301s" podCreationTimestamp="2026-02-25 07:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:42.648435765 +0000 UTC m=+1216.010261785" watchObservedRunningTime="2026-02-25 07:37:42.66602301 +0000 UTC m=+1216.027849040" Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.752725 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.902720675 podStartE2EDuration="4.752699539s" podCreationTimestamp="2026-02-25 07:37:38 +0000 UTC" firstStartedPulling="2026-02-25 07:37:39.880289109 +0000 UTC m=+1213.242115119" lastFinishedPulling="2026-02-25 07:37:40.730267953 +0000 UTC m=+1214.092093983" observedRunningTime="2026-02-25 07:37:42.678061822 +0000 UTC m=+1216.039887842" watchObservedRunningTime="2026-02-25 07:37:42.752699539 +0000 UTC m=+1216.114525559" Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.961445 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:37:42 crc kubenswrapper[4749]: I0225 07:37:42.961500 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b7b887d98-mfthr" podUID="29bd41cc-03ed-4df9-9862-c52b57a4f14e" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.168:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.334499 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b0e946-6c95-4168-99a0-74d26d717583" path="/var/lib/kubelet/pods/42b0e946-6c95-4168-99a0-74d26d717583/volumes" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.335677 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" path="/var/lib/kubelet/pods/844c852a-ae6d-47e2-804a-ecce73c85522/volumes" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.525537 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.647862 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerID="61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" exitCode=0 Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.647912 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerID="b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" exitCode=143 Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.647928 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.648034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerDied","Data":"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a"} Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.648078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerDied","Data":"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8"} Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.648098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dd13c0b4-6957-47b2-9f8b-4df9760f836f","Type":"ContainerDied","Data":"6c60fd45c8d2b3d8ed6465581853bc6238f81d50df17ad8ad8a280ea5ea8c906"} Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.648124 4749 scope.go:117] "RemoveContainer" containerID="61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.670202 4749 scope.go:117] "RemoveContainer" containerID="b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.687961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 
07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688058 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qw5\" (UniqueName: \"kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id\") pod \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\" (UID: \"dd13c0b4-6957-47b2-9f8b-4df9760f836f\") " Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.688734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id" (OuterVolumeSpecName: 
"etc-machine-id") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.689528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs" (OuterVolumeSpecName: "logs") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.691965 4749 scope.go:117] "RemoveContainer" containerID="61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" Feb 25 07:37:43 crc kubenswrapper[4749]: E0225 07:37:43.693193 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a\": container with ID starting with 61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a not found: ID does not exist" containerID="61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693240 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a"} err="failed to get container status \"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a\": rpc error: code = NotFound desc = could not find container \"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a\": container with ID starting with 61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a not found: ID does not exist" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693272 4749 scope.go:117] "RemoveContainer" 
containerID="b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" Feb 25 07:37:43 crc kubenswrapper[4749]: E0225 07:37:43.693657 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8\": container with ID starting with b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8 not found: ID does not exist" containerID="b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693682 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8"} err="failed to get container status \"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8\": rpc error: code = NotFound desc = could not find container \"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8\": container with ID starting with b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8 not found: ID does not exist" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693700 4749 scope.go:117] "RemoveContainer" containerID="61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693891 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a"} err="failed to get container status \"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a\": rpc error: code = NotFound desc = could not find container \"61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a\": container with ID starting with 61e52bb281a0af4038123792c1591e53f9057280522ba6185068699c8cd6852a not found: ID does not exist" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.693917 4749 scope.go:117] 
"RemoveContainer" containerID="b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.694089 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8"} err="failed to get container status \"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8\": rpc error: code = NotFound desc = could not find container \"b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8\": container with ID starting with b02862905b8bb7b3e94804cbbdc3acc8125e0a2dfda79f28907ddfe9f61601a8 not found: ID does not exist" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.694465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.695103 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5" (OuterVolumeSpecName: "kube-api-access-f6qw5") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "kube-api-access-f6qw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.695363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts" (OuterVolumeSpecName: "scripts") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.715557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.745611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data" (OuterVolumeSpecName: "config-data") pod "dd13c0b4-6957-47b2-9f8b-4df9760f836f" (UID: "dd13c0b4-6957-47b2-9f8b-4df9760f836f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790822 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd13c0b4-6957-47b2-9f8b-4df9760f836f-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790874 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790895 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790912 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc 
kubenswrapper[4749]: I0225 07:37:43.790931 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd13c0b4-6957-47b2-9f8b-4df9760f836f-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6qw5\" (UniqueName: \"kubernetes.io/projected/dd13c0b4-6957-47b2-9f8b-4df9760f836f-kube-api-access-f6qw5\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.790967 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd13c0b4-6957-47b2-9f8b-4df9760f836f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:43 crc kubenswrapper[4749]: I0225 07:37:43.998688 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.012654 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.024974 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025420 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker-log" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025473 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025482 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" 
containerName="cinder-api-log" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025498 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025506 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025521 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025529 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener-log" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025537 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025545 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.025569 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025576 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025794 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025816 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025827 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025841 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" containerName="cinder-api" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025855 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="844c852a-ae6d-47e2-804a-ecce73c85522" containerName="barbican-keystone-listener-log" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.025865 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b0e946-6c95-4168-99a0-74d26d717583" containerName="barbican-worker" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.026955 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.029247 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.029630 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.029334 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.047287 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.199568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.199643 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.199713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.199745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.199790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-scripts\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.200008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvth\" (UniqueName: \"kubernetes.io/projected/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-kube-api-access-2pvth\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.200088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data-custom\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.200152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-logs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.200245 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.301630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.301834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.301953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-scripts\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvth\" (UniqueName: \"kubernetes.io/projected/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-kube-api-access-2pvth\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.301950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data-custom\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-logs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-public-tls-certs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302888 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.302978 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.303209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-logs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.306375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-public-tls-certs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.306855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data-custom\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.307576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.307728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-scripts\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.310287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.319103 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-config-data\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.322536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvth\" (UniqueName: \"kubernetes.io/projected/65d2cd6c-9f4d-4d9d-9032-7798a57b7aec-kube-api-access-2pvth\") pod \"cinder-api-0\" (UID: \"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec\") " pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.354427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.532672 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607449 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: 
\"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607858 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.607985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsmwn\" (UniqueName: \"kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn\") pod \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\" (UID: \"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce\") " Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.608358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.608497 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.608825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.613470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts" (OuterVolumeSpecName: "scripts") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.613615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn" (OuterVolumeSpecName: "kube-api-access-bsmwn") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "kube-api-access-bsmwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.632196 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.659758 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerID="d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09" exitCode=0 Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.660102 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.660934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerDied","Data":"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09"} Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.661022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ec69779f-8eea-4feb-b9e3-a9f2d2bdabce","Type":"ContainerDied","Data":"3f7ef6ae2a88ccffd9998836a9a29d9b07fdb36ab4d657829aac182d6d129bb0"} Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.661048 4749 scope.go:117] "RemoveContainer" containerID="d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.671520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.683817 4749 scope.go:117] "RemoveContainer" containerID="9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.710301 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.710334 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.710345 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.710357 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsmwn\" (UniqueName: \"kubernetes.io/projected/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-kube-api-access-bsmwn\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.710367 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.715878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data" (OuterVolumeSpecName: "config-data") pod "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" (UID: "ec69779f-8eea-4feb-b9e3-a9f2d2bdabce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.719479 4749 scope.go:117] "RemoveContainer" containerID="d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.747261 4749 scope.go:117] "RemoveContainer" containerID="d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.747836 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8\": container with ID starting with d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8 not found: ID does not exist" containerID="d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.747885 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8"} err="failed to get container status \"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8\": rpc error: code = NotFound desc = could not find container \"d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8\": container with ID starting with d37a7bca49b106eae75a318b11e16fe7d675e8ecdc8d9a1f6b4c95abb0f480e8 not found: ID does not exist" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.747915 4749 scope.go:117] "RemoveContainer" containerID="9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.748511 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1\": container with ID starting with 
9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1 not found: ID does not exist" containerID="9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.748561 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1"} err="failed to get container status \"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1\": rpc error: code = NotFound desc = could not find container \"9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1\": container with ID starting with 9f282e96e4c14bb48026ba7e4def4d40801e4485faee2e16858393cfb7b4c1a1 not found: ID does not exist" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.748617 4749 scope.go:117] "RemoveContainer" containerID="d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09" Feb 25 07:37:44 crc kubenswrapper[4749]: E0225 07:37:44.749035 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09\": container with ID starting with d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09 not found: ID does not exist" containerID="d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.749067 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09"} err="failed to get container status \"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09\": rpc error: code = NotFound desc = could not find container \"d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09\": container with ID starting with d72148f5dfd50e959dcc6c563cec13b961977b924c61791bf732db5068f78c09 not found: ID does not 
exist" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.812684 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:44 crc kubenswrapper[4749]: I0225 07:37:44.814376 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.060214 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.071754 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.101843 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: E0225 07:37:45.104532 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="sg-core" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.104564 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="sg-core" Feb 25 07:37:45 crc kubenswrapper[4749]: E0225 07:37:45.104639 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="ceilometer-notification-agent" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.104648 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="ceilometer-notification-agent" Feb 25 07:37:45 crc kubenswrapper[4749]: E0225 07:37:45.104665 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="proxy-httpd" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.104672 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="proxy-httpd" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.105329 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="proxy-httpd" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.105368 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="sg-core" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.105376 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" containerName="ceilometer-notification-agent" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.112397 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.114573 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.114758 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.117097 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: E0225 07:37:45.134937 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec69779f_8eea_4feb_b9e3_a9f2d2bdabce.slice/crio-3f7ef6ae2a88ccffd9998836a9a29d9b07fdb36ab4d657829aac182d6d129bb0\": RecentStats: unable to find data in memory cache]" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.219801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts\") 
pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.220097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.220125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.220177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.220204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.220236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: 
I0225 07:37:45.220262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsc4c\" (UniqueName: \"kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsc4c\" (UniqueName: \"kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.321923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.322844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.324875 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.325634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.327108 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.327385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.328194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.337829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsc4c\" (UniqueName: \"kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c\") pod \"ceilometer-0\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") " pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.338660 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd13c0b4-6957-47b2-9f8b-4df9760f836f" path="/var/lib/kubelet/pods/dd13c0b4-6957-47b2-9f8b-4df9760f836f/volumes" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.339417 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec69779f-8eea-4feb-b9e3-a9f2d2bdabce" path="/var/lib/kubelet/pods/ec69779f-8eea-4feb-b9e3-a9f2d2bdabce/volumes" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.437771 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.702130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec","Type":"ContainerStarted","Data":"2153950d71c40bb7767dc6b941b2e8ebac397c4dc26c4f0eb7be5e93c76d327f"} Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.702188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec","Type":"ContainerStarted","Data":"ff88a696cca9b815590ffd664a53fd6788c80493124d2104e1a104c73335e1d7"} Feb 25 07:37:45 crc kubenswrapper[4749]: I0225 07:37:45.904133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:37:45 crc kubenswrapper[4749]: W0225 07:37:45.914351 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf2e978_07de_4a4d_9e7f_576500a7b8d4.slice/crio-3654eb4bae11f3d4976a40af06ced7cf5680f8c1918f66e0860dcf2a6ffd86b8 WatchSource:0}: Error finding container 3654eb4bae11f3d4976a40af06ced7cf5680f8c1918f66e0860dcf2a6ffd86b8: Status 404 returned error can't find the container with id 3654eb4bae11f3d4976a40af06ced7cf5680f8c1918f66e0860dcf2a6ffd86b8 Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.716470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerStarted","Data":"b461ee5b0915e317802ce677020a7444678c39b138bbe667024e314382f94379"} Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.717038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerStarted","Data":"3654eb4bae11f3d4976a40af06ced7cf5680f8c1918f66e0860dcf2a6ffd86b8"} Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 
07:37:46.717876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"65d2cd6c-9f4d-4d9d-9032-7798a57b7aec","Type":"ContainerStarted","Data":"795687cbc527bf2c397fa52cdee6b2011c0334ae519e0fe603b8ba3f3d90deb8"} Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.718038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.736304 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.736288509 podStartE2EDuration="3.736288509s" podCreationTimestamp="2026-02-25 07:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:46.735623794 +0000 UTC m=+1220.097449814" watchObservedRunningTime="2026-02-25 07:37:46.736288509 +0000 UTC m=+1220.098114529" Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.774018 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7675646668-99wj9" Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.840280 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:37:46 crc kubenswrapper[4749]: I0225 07:37:46.854482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.006143 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75648c457c-q6rdr"] Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.006424 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75648c457c-q6rdr" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-api" 
containerID="cri-o://33f99d7237107793204b10764c68749b1a7f025a9f51e118f348a6a6b9d6a140" gracePeriod=30 Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.006559 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75648c457c-q6rdr" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" containerID="cri-o://2f375557899fed659b5c5675661801dae8b71c8aa8c3561dfc4c2bd49ae7c5db" gracePeriod=30 Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.023233 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-75648c457c-q6rdr" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9696/\": EOF" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.033435 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-668b654645-4xlm5"] Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.034925 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.058789 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-668b654645-4xlm5"] Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.155900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-httpd-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nph5w\" (UniqueName: \"kubernetes.io/projected/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-kube-api-access-nph5w\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-internal-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156417 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-ovndb-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-combined-ca-bundle\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.156532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-public-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-httpd-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nph5w\" (UniqueName: \"kubernetes.io/projected/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-kube-api-access-nph5w\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-internal-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-ovndb-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.257970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-combined-ca-bundle\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.258001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-public-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.264112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-public-tls-certs\") pod \"neutron-668b654645-4xlm5\" 
(UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.268846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-httpd-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.269432 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-config\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.269576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-internal-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.270584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-ovndb-tls-certs\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.273123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-combined-ca-bundle\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 
07:37:47.302578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nph5w\" (UniqueName: \"kubernetes.io/projected/8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6-kube-api-access-nph5w\") pod \"neutron-668b654645-4xlm5\" (UID: \"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6\") " pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.355158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.734819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerStarted","Data":"bc42a8762d916b1076342972c2099b541968b69e37585f1a2f8590b5f6db1d51"} Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.736972 4749 generic.go:334] "Generic (PLEG): container finished" podID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerID="2f375557899fed659b5c5675661801dae8b71c8aa8c3561dfc4c2bd49ae7c5db" exitCode=0 Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.737041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerDied","Data":"2f375557899fed659b5c5675661801dae8b71c8aa8c3561dfc4c2bd49ae7c5db"} Feb 25 07:37:47 crc kubenswrapper[4749]: I0225 07:37:47.995409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-668b654645-4xlm5"] Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.598544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.711003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.745778 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-668b654645-4xlm5" event={"ID":"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6","Type":"ContainerStarted","Data":"cf1f08d15e0063f35d42ae12fd52171e8621d9992744d71515aa8b472162de9c"} Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.745820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668b654645-4xlm5" event={"ID":"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6","Type":"ContainerStarted","Data":"4abb59757e426fb1da270bd02cef696a816f96a30d6e9de9d16b0296dfef8cd2"} Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.745832 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-668b654645-4xlm5" event={"ID":"8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6","Type":"ContainerStarted","Data":"a4d2b3e604a67d28ee98b7b6847e66d4a824667dd80bf1fef7ea82657bff61aa"} Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.747505 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.748095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerStarted","Data":"036f25c6596874dae355483665d3d78761afbee98a2cae620d7105a615fced75"} Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.775238 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-668b654645-4xlm5" podStartSLOduration=1.7752177059999998 podStartE2EDuration="1.775217706s" podCreationTimestamp="2026-02-25 07:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:48.769869467 +0000 UTC m=+1222.131695487" watchObservedRunningTime="2026-02-25 07:37:48.775217706 +0000 UTC m=+1222.137043716" Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.818109 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/horizon-75597b5c88-58jkm" Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.937303 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.937766 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon-log" containerID="cri-o://4d8d6ff5f005eda3d7bac28c9113d2df900871e30469621e74315136f4865155" gracePeriod=30 Feb 25 07:37:48 crc kubenswrapper[4749]: I0225 07:37:48.938127 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" containerID="cri-o://269cf05610217219000c47b8623644aac4dfeb65efba19487d9ef0872efbf736" gracePeriod=30 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.310270 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b57cbd48-btws7" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.349180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.374860 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-75648c457c-q6rdr" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9696/\": dial tcp 10.217.0.163:9696: connect: connection refused" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.403677 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.404057 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-7b9d595878-lkpsg" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api-log" containerID="cri-o://1d4905dc88a60a1eba304a13033b7ad68d1822dfd5b3618fd0ab4988c53e7256" gracePeriod=30 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.404561 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b9d595878-lkpsg" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api" containerID="cri-o://bfc0f2b581ec8602ef9c54e2dbd8b61eba0056db2f7dfa3db9deb0b77b268856" gracePeriod=30 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.443341 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.443575 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="dnsmasq-dns" containerID="cri-o://a13f7e155e5dbbbb3a6baf9f9287d27c5a97d62ea8f896020b236ccd81689d43" gracePeriod=10 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.687057 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.784038 4749 generic.go:334] "Generic (PLEG): container finished" podID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerID="a13f7e155e5dbbbb3a6baf9f9287d27c5a97d62ea8f896020b236ccd81689d43" exitCode=0 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.784164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" event={"ID":"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9","Type":"ContainerDied","Data":"a13f7e155e5dbbbb3a6baf9f9287d27c5a97d62ea8f896020b236ccd81689d43"} Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.788811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerStarted","Data":"c1d45bf15cb2fd5fe20ee7a87e69185037dfa16656b99fa73813bfe3dcd0199b"} Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.789489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.791054 4749 generic.go:334] "Generic (PLEG): container finished" podID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerID="1d4905dc88a60a1eba304a13033b7ad68d1822dfd5b3618fd0ab4988c53e7256" exitCode=143 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.791271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerDied","Data":"1d4905dc88a60a1eba304a13033b7ad68d1822dfd5b3618fd0ab4988c53e7256"} Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.798095 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.798632 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="probe" containerID="cri-o://74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97" gracePeriod=30 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.798608 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="cinder-scheduler" containerID="cri-o://433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca" gracePeriod=30 Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.821286 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.406281309 podStartE2EDuration="4.821266439s" podCreationTimestamp="2026-02-25 
07:37:45 +0000 UTC" firstStartedPulling="2026-02-25 07:37:45.919421518 +0000 UTC m=+1219.281247568" lastFinishedPulling="2026-02-25 07:37:49.334406638 +0000 UTC m=+1222.696232698" observedRunningTime="2026-02-25 07:37:49.811682587 +0000 UTC m=+1223.173508597" watchObservedRunningTime="2026-02-25 07:37:49.821266439 +0000 UTC m=+1223.183092459" Feb 25 07:37:49 crc kubenswrapper[4749]: I0225 07:37:49.975174 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.068432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.068539 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf85s\" (UniqueName: \"kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.068560 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.068582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 
07:37:50.068990 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.069024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0\") pod \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\" (UID: \"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9\") " Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.088714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s" (OuterVolumeSpecName: "kube-api-access-gf85s") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "kube-api-access-gf85s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.121883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.133498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.147980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config" (OuterVolumeSpecName: "config") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.155880 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.156833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" (UID: "cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.172957 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.173029 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.173042 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.173051 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf85s\" (UniqueName: \"kubernetes.io/projected/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-kube-api-access-gf85s\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.173061 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.173069 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.811649 4749 generic.go:334] "Generic (PLEG): container finished" podID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerID="74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97" exitCode=0 Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.811975 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerDied","Data":"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97"} Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.814818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" event={"ID":"cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9","Type":"ContainerDied","Data":"921b7c8b4037e6343fcb26cc28a34b3a0709f53aff6c2683b4666c50ad8e59b3"} Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.814927 4749 scope.go:117] "RemoveContainer" containerID="a13f7e155e5dbbbb3a6baf9f9287d27c5a97d62ea8f896020b236ccd81689d43" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.814937 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-dnmcs" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.854269 4749 scope.go:117] "RemoveContainer" containerID="d466ba914e2ddab0a640dde2609e543ddddfeb75891685a85258f22e48032f7c" Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.864346 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:50 crc kubenswrapper[4749]: I0225 07:37:50.876805 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-dnmcs"] Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.356161 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" path="/var/lib/kubelet/pods/cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9/volumes" Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.671375 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.671798 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.671856 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.672899 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.673057 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6" gracePeriod=600 Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.830883 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6" exitCode=0 Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.830949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6"} Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.830986 4749 scope.go:117] "RemoveContainer" containerID="83827623785bdf0bc92f9aff72fe55166395c8a0d081648fc5edcfc2b5fffc65" Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.835097 4749 generic.go:334] "Generic (PLEG): container finished" podID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerID="33f99d7237107793204b10764c68749b1a7f025a9f51e118f348a6a6b9d6a140" exitCode=0 Feb 25 07:37:51 crc kubenswrapper[4749]: I0225 07:37:51.835128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerDied","Data":"33f99d7237107793204b10764c68749b1a7f025a9f51e118f348a6a6b9d6a140"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.151381 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75648c457c-q6rdr" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.277344 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.328613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.330941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.331148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.331165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.331489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.331524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdgp5\" (UniqueName: 
\"kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.331622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs\") pod \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\" (UID: \"33cca359-81f7-4ecb-a2a9-a8eb64180be6\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.338527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5" (OuterVolumeSpecName: "kube-api-access-zdgp5") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "kube-api-access-zdgp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.342247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.388105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.391901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.404481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config" (OuterVolumeSpecName: "config") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.420861 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432487 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432560 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432670 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5fh\" (UniqueName: \"kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.432996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle\") pod \"72364f50-56ba-4cd1-8672-053c7e2103a9\" (UID: \"72364f50-56ba-4cd1-8672-053c7e2103a9\") " Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.435696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh" (OuterVolumeSpecName: "kube-api-access-6x5fh") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "kube-api-access-6x5fh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.435953 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.435980 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436011 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdgp5\" (UniqueName: \"kubernetes.io/projected/33cca359-81f7-4ecb-a2a9-a8eb64180be6-kube-api-access-zdgp5\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436023 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436032 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436041 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436050 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72364f50-56ba-4cd1-8672-053c7e2103a9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.436623 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "33cca359-81f7-4ecb-a2a9-a8eb64180be6" (UID: "33cca359-81f7-4ecb-a2a9-a8eb64180be6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.437137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts" (OuterVolumeSpecName: "scripts") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.439067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.495047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537373 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33cca359-81f7-4ecb-a2a9-a8eb64180be6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537404 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x5fh\" (UniqueName: \"kubernetes.io/projected/72364f50-56ba-4cd1-8672-053c7e2103a9-kube-api-access-6x5fh\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537416 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537424 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537434 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.537842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data" (OuterVolumeSpecName: "config-data") pod "72364f50-56ba-4cd1-8672-053c7e2103a9" (UID: "72364f50-56ba-4cd1-8672-053c7e2103a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.595053 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b9d595878-lkpsg" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:46742->10.217.0.169:9311: read: connection reset by peer" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.595054 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b9d595878-lkpsg" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:46740->10.217.0.169:9311: read: connection reset by peer" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.639385 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72364f50-56ba-4cd1-8672-053c7e2103a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.870420 4749 generic.go:334] "Generic (PLEG): container finished" podID="54c92c50-880d-4bf0-823f-16f13d25066b" containerID="269cf05610217219000c47b8623644aac4dfeb65efba19487d9ef0872efbf736" exitCode=0 Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.870842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerDied","Data":"269cf05610217219000c47b8623644aac4dfeb65efba19487d9ef0872efbf736"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.875954 4749 generic.go:334] "Generic (PLEG): container finished" podID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerID="bfc0f2b581ec8602ef9c54e2dbd8b61eba0056db2f7dfa3db9deb0b77b268856" exitCode=0 Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.876022 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerDied","Data":"bfc0f2b581ec8602ef9c54e2dbd8b61eba0056db2f7dfa3db9deb0b77b268856"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.878013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.880898 4749 generic.go:334] "Generic (PLEG): container finished" podID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerID="433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca" exitCode=0 Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.881027 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.881099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerDied","Data":"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.881157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"72364f50-56ba-4cd1-8672-053c7e2103a9","Type":"ContainerDied","Data":"900d7d5738fb04ae9c6848d11fe324b6e0c28c23d6301e13414393befcc19abf"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.881184 4749 scope.go:117] "RemoveContainer" containerID="74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.899147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75648c457c-q6rdr" 
event={"ID":"33cca359-81f7-4ecb-a2a9-a8eb64180be6","Type":"ContainerDied","Data":"93f752ee00398ad4fccd2bbe4f8d75bea0c32fd8a9b73a5316f5842bc09cf10b"} Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.899266 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75648c457c-q6rdr" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.972395 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.973573 4749 scope.go:117] "RemoveContainer" containerID="433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.998494 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.999324 4749 scope.go:117] "RemoveContainer" containerID="74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97" Feb 25 07:37:52 crc kubenswrapper[4749]: E0225 07:37:52.999862 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97\": container with ID starting with 74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97 not found: ID does not exist" containerID="74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.999931 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97"} err="failed to get container status \"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97\": rpc error: code = NotFound desc = could not find container \"74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97\": container with ID starting 
with 74584d9980a3db5cd44c9718101c0e909cafc7d21f323b1f7a5926d801186b97 not found: ID does not exist" Feb 25 07:37:52 crc kubenswrapper[4749]: I0225 07:37:52.999960 4749 scope.go:117] "RemoveContainer" containerID="433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.000269 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca\": container with ID starting with 433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca not found: ID does not exist" containerID="433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.000288 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca"} err="failed to get container status \"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca\": rpc error: code = NotFound desc = could not find container \"433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca\": container with ID starting with 433fba1ba94a1e052aa370075bcd1b5792a1fb9d1672eedc383b675361962cca not found: ID does not exist" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.000314 4749 scope.go:117] "RemoveContainer" containerID="2f375557899fed659b5c5675661801dae8b71c8aa8c3561dfc4c2bd49ae7c5db" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.047760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d28h4\" (UniqueName: \"kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.048828 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle\") pod \"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\" (UID: 
\"c59e484e-ac00-41d4-bd3f-709c2d4ae55b\") " Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.104893 4749 scope.go:117] "RemoveContainer" containerID="33f99d7237107793204b10764c68749b1a7f025a9f51e118f348a6a6b9d6a140" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.052750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs" (OuterVolumeSpecName: "logs") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.110965 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.105785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4" (OuterVolumeSpecName: "kube-api-access-d28h4") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "kube-api-access-d28h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.110840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.113581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.134656 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="dnsmasq-dns" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135111 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="dnsmasq-dns" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135126 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135133 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135152 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135161 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" 
containerName="cinder-scheduler" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135167 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="cinder-scheduler" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135181 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="probe" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135187 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="probe" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135202 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api-log" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135208 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api-log" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135221 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="init" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135226 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="init" Feb 25 07:37:53 crc kubenswrapper[4749]: E0225 07:37:53.135236 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-api" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135241 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-api" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135411 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-httpd" Feb 25 07:37:53 crc 
kubenswrapper[4749]: I0225 07:37:53.135424 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd36ef0-6a4d-4333-b5b5-e5a55132dbc9" containerName="dnsmasq-dns" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135435 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api-log" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135445 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" containerName="neutron-api" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135466 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="probe" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135481 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" containerName="cinder-scheduler" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.135490 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" containerName="barbican-api" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.136428 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.155012 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.156766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.166450 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75648c457c-q6rdr"] Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.175630 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75648c457c-q6rdr"] Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.183624 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.191917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data" (OuterVolumeSpecName: "config-data") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.202693 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c59e484e-ac00-41d4-bd3f-709c2d4ae55b" (UID: "c59e484e-ac00-41d4-bd3f-709c2d4ae55b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpd7l\" (UniqueName: \"kubernetes.io/projected/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-kube-api-access-rpd7l\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 
07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206287 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206298 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206310 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206319 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206328 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206338 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d28h4\" (UniqueName: \"kubernetes.io/projected/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-kube-api-access-d28h4\") on node \"crc\" DevicePath \"\"" Feb 25 
07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.206347 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59e484e-ac00-41d4-bd3f-709c2d4ae55b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpd7l\" (UniqueName: 
\"kubernetes.io/projected/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-kube-api-access-rpd7l\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.308536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.317491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.317497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-scripts\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.317546 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" 
Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.317842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-config-data\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.327656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpd7l\" (UniqueName: \"kubernetes.io/projected/b6ba75dd-bf4c-4d7e-88b3-cf11679c231a-kube-api-access-rpd7l\") pod \"cinder-scheduler-0\" (UID: \"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a\") " pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.331453 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33cca359-81f7-4ecb-a2a9-a8eb64180be6" path="/var/lib/kubelet/pods/33cca359-81f7-4ecb-a2a9-a8eb64180be6/volumes" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.332172 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72364f50-56ba-4cd1-8672-053c7e2103a9" path="/var/lib/kubelet/pods/72364f50-56ba-4cd1-8672-053c7e2103a9/volumes" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.470921 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.735312 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.791036 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.922910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b9d595878-lkpsg" event={"ID":"c59e484e-ac00-41d4-bd3f-709c2d4ae55b","Type":"ContainerDied","Data":"a35980a262d1982c56f915a8809cc28292838bbe71807534edf190d035efd841"} Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.922967 4749 scope.go:117] "RemoveContainer" containerID="bfc0f2b581ec8602ef9c54e2dbd8b61eba0056db2f7dfa3db9deb0b77b268856" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.923015 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b9d595878-lkpsg" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.959048 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.959125 4749 scope.go:117] "RemoveContainer" containerID="1d4905dc88a60a1eba304a13033b7ad68d1822dfd5b3618fd0ab4988c53e7256" Feb 25 07:37:53 crc kubenswrapper[4749]: I0225 07:37:53.976395 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b9d595878-lkpsg"] Feb 25 07:37:54 crc kubenswrapper[4749]: I0225 07:37:54.000121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 25 07:37:54 crc kubenswrapper[4749]: W0225 07:37:54.005100 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6ba75dd_bf4c_4d7e_88b3_cf11679c231a.slice/crio-8c7e8356bd3ab0d7d6d39b51e4d323a1ee527827b0021c381e72ee9b5e560a6a WatchSource:0}: Error finding container 8c7e8356bd3ab0d7d6d39b51e4d323a1ee527827b0021c381e72ee9b5e560a6a: Status 404 returned error can't find the container with id 8c7e8356bd3ab0d7d6d39b51e4d323a1ee527827b0021c381e72ee9b5e560a6a Feb 25 07:37:54 crc kubenswrapper[4749]: I0225 07:37:54.936934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a","Type":"ContainerStarted","Data":"ebc443b5b81f573a503f1bc994459ed3d999379b20cf84a24947a674dcca206e"} Feb 25 07:37:54 crc kubenswrapper[4749]: I0225 07:37:54.937379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a","Type":"ContainerStarted","Data":"8c7e8356bd3ab0d7d6d39b51e4d323a1ee527827b0021c381e72ee9b5e560a6a"} Feb 25 07:37:54 crc kubenswrapper[4749]: I0225 07:37:54.972665 4749 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Feb 25 07:37:55 crc kubenswrapper[4749]: I0225 07:37:55.347722 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c59e484e-ac00-41d4-bd3f-709c2d4ae55b" path="/var/lib/kubelet/pods/c59e484e-ac00-41d4-bd3f-709c2d4ae55b/volumes" Feb 25 07:37:55 crc kubenswrapper[4749]: I0225 07:37:55.956197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b6ba75dd-bf4c-4d7e-88b3-cf11679c231a","Type":"ContainerStarted","Data":"349534ea612fac060a5552cdf1aac1bd6e13269ad94ba40719b5fe7a7f55313c"} Feb 25 07:37:55 crc kubenswrapper[4749]: I0225 07:37:55.992176 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.992155628 podStartE2EDuration="3.992155628s" podCreationTimestamp="2026-02-25 07:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:37:55.983621242 +0000 UTC m=+1229.345447302" watchObservedRunningTime="2026-02-25 07:37:55.992155628 +0000 UTC m=+1229.353981658" Feb 25 07:37:56 crc kubenswrapper[4749]: I0225 07:37:56.357834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.527049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.542636 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5578bc7b56-qlg29" Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.652330 
4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75c857c98-6vnbd"] Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.652575 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75c857c98-6vnbd" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-log" containerID="cri-o://b1287241aca3bfe61c4f30f9c25fc1e9ecd40314af97b4cf5a59bca0f342fd21" gracePeriod=30 Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.652937 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75c857c98-6vnbd" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-api" containerID="cri-o://879d94ce23cdbf332800d8b90f6d201e177283b277b1e52a332ac11dbbcd18db" gracePeriod=30 Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.973668 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-567ffd99f4-495rj" Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.974990 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5562f33-e36a-46f0-bb68-b132c7064252" containerID="b1287241aca3bfe61c4f30f9c25fc1e9ecd40314af97b4cf5a59bca0f342fd21" exitCode=143 Feb 25 07:37:57 crc kubenswrapper[4749]: I0225 07:37:57.975072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerDied","Data":"b1287241aca3bfe61c4f30f9c25fc1e9ecd40314af97b4cf5a59bca0f342fd21"} Feb 25 07:37:58 crc kubenswrapper[4749]: I0225 07:37:58.471905 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.130300 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533418-qwhq2"] Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.131920 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.135939 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.136005 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.136058 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.150414 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533418-qwhq2"] Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.241539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzhr\" (UniqueName: \"kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr\") pod \"auto-csr-approver-29533418-qwhq2\" (UID: \"90a0afce-d86a-422c-8a74-b2e2554837d8\") " pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.343633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzhr\" (UniqueName: \"kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr\") pod \"auto-csr-approver-29533418-qwhq2\" (UID: \"90a0afce-d86a-422c-8a74-b2e2554837d8\") " pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.363556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzhr\" (UniqueName: \"kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr\") pod \"auto-csr-approver-29533418-qwhq2\" (UID: 
\"90a0afce-d86a-422c-8a74-b2e2554837d8\") " pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.451347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:00 crc kubenswrapper[4749]: I0225 07:38:00.952063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533418-qwhq2"] Feb 25 07:38:00 crc kubenswrapper[4749]: W0225 07:38:00.961391 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a0afce_d86a_422c_8a74_b2e2554837d8.slice/crio-e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa WatchSource:0}: Error finding container e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa: Status 404 returned error can't find the container with id e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.007471 4749 generic.go:334] "Generic (PLEG): container finished" podID="e5562f33-e36a-46f0-bb68-b132c7064252" containerID="879d94ce23cdbf332800d8b90f6d201e177283b277b1e52a332ac11dbbcd18db" exitCode=0 Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.007559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerDied","Data":"879d94ce23cdbf332800d8b90f6d201e177283b277b1e52a332ac11dbbcd18db"} Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.010609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" event={"ID":"90a0afce-d86a-422c-8a74-b2e2554837d8","Type":"ContainerStarted","Data":"e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa"} Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.150180 4749 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.158799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.158895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.158937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.158970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.158998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm7mh\" (UniqueName: \"kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.159039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.159094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts\") pod \"e5562f33-e36a-46f0-bb68-b132c7064252\" (UID: \"e5562f33-e36a-46f0-bb68-b132c7064252\") " Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.159960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs" (OuterVolumeSpecName: "logs") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.164112 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts" (OuterVolumeSpecName: "scripts") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.186757 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh" (OuterVolumeSpecName: "kube-api-access-mm7mh") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "kube-api-access-mm7mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.219558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.261220 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.261249 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.261260 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5562f33-e36a-46f0-bb68-b132c7064252-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.261269 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm7mh\" (UniqueName: \"kubernetes.io/projected/e5562f33-e36a-46f0-bb68-b132c7064252-kube-api-access-mm7mh\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.262338 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data" (OuterVolumeSpecName: "config-data") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.298743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.300955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5562f33-e36a-46f0-bb68-b132c7064252" (UID: "e5562f33-e36a-46f0-bb68-b132c7064252"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.362815 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.362848 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.362862 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5562f33-e36a-46f0-bb68-b132c7064252-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.494181 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: E0225 07:38:01.494812 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-api" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.494824 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-api" Feb 25 07:38:01 crc kubenswrapper[4749]: E0225 07:38:01.494862 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-log" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.494869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-log" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.495028 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-api" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.495061 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" containerName="placement-log" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.495612 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.500118 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lxqrd" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.500228 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.500202 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.509507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.566625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zslpj\" (UniqueName: \"kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.567001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.567054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.567140 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.668315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.668360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.668385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.668450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zslpj\" (UniqueName: \"kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.669369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.673468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.692017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.694829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zslpj\" (UniqueName: \"kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj\") pod \"openstackclient\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.809900 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.840156 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.846435 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.947897 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.949372 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.959887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:01 crc kubenswrapper[4749]: E0225 07:38:01.965193 4749 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 25 07:38:01 crc kubenswrapper[4749]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_719e01de-2a6e-45ae-b58e-8be506b6643b_0(f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3" Netns:"/var/run/netns/8662aed3-a802-414e-bb60-076ebccc2e02" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3;K8S_POD_UID=719e01de-2a6e-45ae-b58e-8be506b6643b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/719e01de-2a6e-45ae-b58e-8be506b6643b]: expected pod UID "719e01de-2a6e-45ae-b58e-8be506b6643b" but got 
"a7c13290-8c43-443a-8563-ea54a96c975a" from Kube API Feb 25 07:38:01 crc kubenswrapper[4749]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 07:38:01 crc kubenswrapper[4749]: > Feb 25 07:38:01 crc kubenswrapper[4749]: E0225 07:38:01.965259 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 25 07:38:01 crc kubenswrapper[4749]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_719e01de-2a6e-45ae-b58e-8be506b6643b_0(f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3" Netns:"/var/run/netns/8662aed3-a802-414e-bb60-076ebccc2e02" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f5e30c13611d2ca436fb96df95cc02e126a04404e36c23d39e33a890b258c9e3;K8S_POD_UID=719e01de-2a6e-45ae-b58e-8be506b6643b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/719e01de-2a6e-45ae-b58e-8be506b6643b]: expected pod UID "719e01de-2a6e-45ae-b58e-8be506b6643b" but got "a7c13290-8c43-443a-8563-ea54a96c975a" from Kube API Feb 25 07:38:01 crc kubenswrapper[4749]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 25 07:38:01 crc kubenswrapper[4749]: > pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.981648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvddl\" (UniqueName: \"kubernetes.io/projected/a7c13290-8c43-443a-8563-ea54a96c975a-kube-api-access-xvddl\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.981712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.981768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:01 crc kubenswrapper[4749]: I0225 07:38:01.981805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " 
pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.025502 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75c857c98-6vnbd" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.025972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75c857c98-6vnbd" event={"ID":"e5562f33-e36a-46f0-bb68-b132c7064252","Type":"ContainerDied","Data":"74787c233d1946ed9f588bff3ec1bf8b09de03fd92df8eee19e7c3357670969c"} Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.026029 4749 scope.go:117] "RemoveContainer" containerID="879d94ce23cdbf332800d8b90f6d201e177283b277b1e52a332ac11dbbcd18db" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.027425 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.032982 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="719e01de-2a6e-45ae-b58e-8be506b6643b" podUID="a7c13290-8c43-443a-8563-ea54a96c975a" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.047052 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.052278 4749 scope.go:117] "RemoveContainer" containerID="b1287241aca3bfe61c4f30f9c25fc1e9ecd40314af97b4cf5a59bca0f342fd21" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.055843 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75c857c98-6vnbd"] Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.062664 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-75c857c98-6vnbd"] Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.082930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle\") pod \"719e01de-2a6e-45ae-b58e-8be506b6643b\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config\") pod \"719e01de-2a6e-45ae-b58e-8be506b6643b\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zslpj\" (UniqueName: \"kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj\") pod \"719e01de-2a6e-45ae-b58e-8be506b6643b\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " 
pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvddl\" (UniqueName: \"kubernetes.io/projected/a7c13290-8c43-443a-8563-ea54a96c975a-kube-api-access-xvddl\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.083439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.084174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "719e01de-2a6e-45ae-b58e-8be506b6643b" (UID: "719e01de-2a6e-45ae-b58e-8be506b6643b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.084332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.092450 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj" (OuterVolumeSpecName: "kube-api-access-zslpj") pod "719e01de-2a6e-45ae-b58e-8be506b6643b" (UID: "719e01de-2a6e-45ae-b58e-8be506b6643b"). InnerVolumeSpecName "kube-api-access-zslpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.093581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.093581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7c13290-8c43-443a-8563-ea54a96c975a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.093695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719e01de-2a6e-45ae-b58e-8be506b6643b" (UID: "719e01de-2a6e-45ae-b58e-8be506b6643b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.103539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvddl\" (UniqueName: \"kubernetes.io/projected/a7c13290-8c43-443a-8563-ea54a96c975a-kube-api-access-xvddl\") pod \"openstackclient\" (UID: \"a7c13290-8c43-443a-8563-ea54a96c975a\") " pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.184173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret\") pod \"719e01de-2a6e-45ae-b58e-8be506b6643b\" (UID: \"719e01de-2a6e-45ae-b58e-8be506b6643b\") " Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.184505 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.184520 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.184530 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zslpj\" (UniqueName: \"kubernetes.io/projected/719e01de-2a6e-45ae-b58e-8be506b6643b-kube-api-access-zslpj\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.187863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "719e01de-2a6e-45ae-b58e-8be506b6643b" (UID: 
"719e01de-2a6e-45ae-b58e-8be506b6643b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.270437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.286523 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/719e01de-2a6e-45ae-b58e-8be506b6643b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:02 crc kubenswrapper[4749]: I0225 07:38:02.716879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.046831 4749 generic.go:334] "Generic (PLEG): container finished" podID="90a0afce-d86a-422c-8a74-b2e2554837d8" containerID="74908daed5d334da53f77583af35385d537b66a1f7b29501a945650e80bc0195" exitCode=0 Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.046900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" event={"ID":"90a0afce-d86a-422c-8a74-b2e2554837d8","Type":"ContainerDied","Data":"74908daed5d334da53f77583af35385d537b66a1f7b29501a945650e80bc0195"} Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.051569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a7c13290-8c43-443a-8563-ea54a96c975a","Type":"ContainerStarted","Data":"94ae81e2c91edba7bb9627edb75c1aafe9aef198420f5cdd886f684ad8dbb0e4"} Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.051580 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.072623 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="719e01de-2a6e-45ae-b58e-8be506b6643b" podUID="a7c13290-8c43-443a-8563-ea54a96c975a" Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.336974 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719e01de-2a6e-45ae-b58e-8be506b6643b" path="/var/lib/kubelet/pods/719e01de-2a6e-45ae-b58e-8be506b6643b/volumes" Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.337546 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5562f33-e36a-46f0-bb68-b132c7064252" path="/var/lib/kubelet/pods/e5562f33-e36a-46f0-bb68-b132c7064252/volumes" Feb 25 07:38:03 crc kubenswrapper[4749]: I0225 07:38:03.686425 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 25 07:38:04 crc kubenswrapper[4749]: I0225 07:38:04.453356 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" Feb 25 07:38:04 crc kubenswrapper[4749]: I0225 07:38:04.624644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzhr\" (UniqueName: \"kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr\") pod \"90a0afce-d86a-422c-8a74-b2e2554837d8\" (UID: \"90a0afce-d86a-422c-8a74-b2e2554837d8\") " Feb 25 07:38:04 crc kubenswrapper[4749]: I0225 07:38:04.633708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr" (OuterVolumeSpecName: "kube-api-access-2fzhr") pod "90a0afce-d86a-422c-8a74-b2e2554837d8" (UID: "90a0afce-d86a-422c-8a74-b2e2554837d8"). InnerVolumeSpecName "kube-api-access-2fzhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:04 crc kubenswrapper[4749]: I0225 07:38:04.727350 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzhr\" (UniqueName: \"kubernetes.io/projected/90a0afce-d86a-422c-8a74-b2e2554837d8-kube-api-access-2fzhr\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:04 crc kubenswrapper[4749]: I0225 07:38:04.973257 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.074005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533418-qwhq2" event={"ID":"90a0afce-d86a-422c-8a74-b2e2554837d8","Type":"ContainerDied","Data":"e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa"} Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.074046 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27624c0731a90052299fddbcabb378c658206810f469e0291897ae084a09aaa" Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.074576 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533418-qwhq2"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.306584 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6846d6d889-85shz"]
Feb 25 07:38:05 crc kubenswrapper[4749]: E0225 07:38:05.306936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a0afce-d86a-422c-8a74-b2e2554837d8" containerName="oc"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.306954 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a0afce-d86a-422c-8a74-b2e2554837d8" containerName="oc"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.307127 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a0afce-d86a-422c-8a74-b2e2554837d8" containerName="oc"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.308407 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.313518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.314038 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.314254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.357904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-etc-swift\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt6hh\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-kube-api-access-rt6hh\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-run-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-combined-ca-bundle\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-internal-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-log-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-public-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.358829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-config-data\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.361144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6846d6d889-85shz"]
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462098 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt6hh\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-kube-api-access-rt6hh\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-run-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-combined-ca-bundle\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-internal-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462207 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-log-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-public-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-config-data\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.462332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-etc-swift\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.463387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-log-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.463671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-run-httpd\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.467403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-config-data\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.467940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-public-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.472538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-etc-swift\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.474719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-combined-ca-bundle\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.488975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt6hh\" (UniqueName: \"kubernetes.io/projected/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-kube-api-access-rt6hh\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.509167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8a7476-3a62-4af3-a5bb-8a4b8d60108f-internal-tls-certs\") pod \"swift-proxy-6846d6d889-85shz\" (UID: \"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f\") " pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.523681 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533412-gwcm2"]
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.547239 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533412-gwcm2"]
Feb 25 07:38:05 crc kubenswrapper[4749]: I0225 07:38:05.668319 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.211652 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.212197 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-central-agent" containerID="cri-o://b461ee5b0915e317802ce677020a7444678c39b138bbe667024e314382f94379" gracePeriod=30
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.212878 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="proxy-httpd" containerID="cri-o://c1d45bf15cb2fd5fe20ee7a87e69185037dfa16656b99fa73813bfe3dcd0199b" gracePeriod=30
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.212936 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="sg-core" containerID="cri-o://036f25c6596874dae355483665d3d78761afbee98a2cae620d7105a615fced75" gracePeriod=30
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.212964 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-notification-agent" containerID="cri-o://bc42a8762d916b1076342972c2099b541968b69e37585f1a2f8590b5f6db1d51" gracePeriod=30
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.222655 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.179:3000/\": EOF"
Feb 25 07:38:06 crc kubenswrapper[4749]: I0225 07:38:06.242530 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6846d6d889-85shz"]
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.095560 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerID="c1d45bf15cb2fd5fe20ee7a87e69185037dfa16656b99fa73813bfe3dcd0199b" exitCode=0
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.096035 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerID="036f25c6596874dae355483665d3d78761afbee98a2cae620d7105a615fced75" exitCode=2
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.096044 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerID="b461ee5b0915e317802ce677020a7444678c39b138bbe667024e314382f94379" exitCode=0
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.096077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerDied","Data":"c1d45bf15cb2fd5fe20ee7a87e69185037dfa16656b99fa73813bfe3dcd0199b"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.096101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerDied","Data":"036f25c6596874dae355483665d3d78761afbee98a2cae620d7105a615fced75"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.096110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerDied","Data":"b461ee5b0915e317802ce677020a7444678c39b138bbe667024e314382f94379"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.097753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6846d6d889-85shz" event={"ID":"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f","Type":"ContainerStarted","Data":"d148a61b5f1375bb0cad48086ae7bcbbc3f8b64179aea4da64e17fcdb9e8899d"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.097778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6846d6d889-85shz" event={"ID":"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f","Type":"ContainerStarted","Data":"585788c327e7c7ba7d136bd81e4e3412f0d17df8f91233a848e7886793df268d"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.097788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6846d6d889-85shz" event={"ID":"ff8a7476-3a62-4af3-a5bb-8a4b8d60108f","Type":"ContainerStarted","Data":"799227e09a0df7155ee4a413c716951933081c1ef323a430f802bd74f6e29951"}
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.098538 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.098567 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6846d6d889-85shz"
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.121634 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6846d6d889-85shz" podStartSLOduration=2.121613577 podStartE2EDuration="2.121613577s" podCreationTimestamp="2026-02-25 07:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:07.114417019 +0000 UTC m=+1240.476243049" watchObservedRunningTime="2026-02-25 07:38:07.121613577 +0000 UTC m=+1240.483439587"
Feb 25 07:38:07 crc kubenswrapper[4749]: I0225 07:38:07.355421 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73ed139-bbb8-49c0-a53d-b09855ded2ea" path="/var/lib/kubelet/pods/a73ed139-bbb8-49c0-a53d-b09855ded2ea/volumes"
Feb 25 07:38:10 crc kubenswrapper[4749]: I0225 07:38:10.129346 4749 generic.go:334] "Generic (PLEG): container finished" podID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerID="bc42a8762d916b1076342972c2099b541968b69e37585f1a2f8590b5f6db1d51" exitCode=0
Feb 25 07:38:10 crc kubenswrapper[4749]: I0225 07:38:10.129534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerDied","Data":"bc42a8762d916b1076342972c2099b541968b69e37585f1a2f8590b5f6db1d51"}
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.479159 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.558065 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.558277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-log" containerID="cri-o://6ecdc80088d758d8064604d275b68591330c03c74e584f8523b3ef0bfb16f8eb" gracePeriod=30
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.558678 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-httpd" containerID="cri-o://ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331" gracePeriod=30
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611278 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611632 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsc4c\" (UniqueName: \"kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611711 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.611825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle\") pod \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\" (UID: \"dbf2e978-07de-4a4d-9e7f-576500a7b8d4\") "
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.612193 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.612323 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.615802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.615952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c" (OuterVolumeSpecName: "kube-api-access-fsc4c") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "kube-api-access-fsc4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.619656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts" (OuterVolumeSpecName: "scripts") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.637371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.702656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.709270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data" (OuterVolumeSpecName: "config-data") pod "dbf2e978-07de-4a4d-9e7f-576500a7b8d4" (UID: "dbf2e978-07de-4a4d-9e7f-576500a7b8d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714021 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714042 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsc4c\" (UniqueName: \"kubernetes.io/projected/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-kube-api-access-fsc4c\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714052 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714060 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714067 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:12 crc kubenswrapper[4749]: I0225 07:38:12.714075 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2e978-07de-4a4d-9e7f-576500a7b8d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.158920 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.158922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dbf2e978-07de-4a4d-9e7f-576500a7b8d4","Type":"ContainerDied","Data":"3654eb4bae11f3d4976a40af06ced7cf5680f8c1918f66e0860dcf2a6ffd86b8"}
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.159079 4749 scope.go:117] "RemoveContainer" containerID="c1d45bf15cb2fd5fe20ee7a87e69185037dfa16656b99fa73813bfe3dcd0199b"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.161153 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerID="6ecdc80088d758d8064604d275b68591330c03c74e584f8523b3ef0bfb16f8eb" exitCode=143
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.161203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerDied","Data":"6ecdc80088d758d8064604d275b68591330c03c74e584f8523b3ef0bfb16f8eb"}
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.162742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a7c13290-8c43-443a-8563-ea54a96c975a","Type":"ContainerStarted","Data":"bec7d7b1550096bf7b7d580e99d7d2e892518a5adc469de8a8d6d1ff03f49eb2"}
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.177005 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.177220 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-log" containerID="cri-o://e3018e6638b8d524d26767e09f5a4fdf417056f3b37a3e63abc914f28fb8ff5e" gracePeriod=30
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.177343 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-httpd" containerID="cri-o://104bf19674a4995a089c2ebf313c6aefcdc3053dd0c3dfa8937f4615466bcaeb" gracePeriod=30
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.236202 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.690020848 podStartE2EDuration="12.236186956s" podCreationTimestamp="2026-02-25 07:38:01 +0000 UTC" firstStartedPulling="2026-02-25 07:38:02.718607222 +0000 UTC m=+1236.080433252" lastFinishedPulling="2026-02-25 07:38:12.26477334 +0000 UTC m=+1245.626599360" observedRunningTime="2026-02-25 07:38:13.22261849 +0000 UTC m=+1246.584444510" watchObservedRunningTime="2026-02-25 07:38:13.236186956 +0000 UTC m=+1246.598012976"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.270933 4749 scope.go:117] "RemoveContainer" containerID="036f25c6596874dae355483665d3d78761afbee98a2cae620d7105a615fced75"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.285179 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.301241 4749 scope.go:117] "RemoveContainer" containerID="bc42a8762d916b1076342972c2099b541968b69e37585f1a2f8590b5f6db1d51"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.304273 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.333982 4749 scope.go:117] "RemoveContainer" containerID="b461ee5b0915e317802ce677020a7444678c39b138bbe667024e314382f94379"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.335649 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" path="/var/lib/kubelet/pods/dbf2e978-07de-4a4d-9e7f-576500a7b8d4/volumes"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336344 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:38:13 crc kubenswrapper[4749]: E0225 07:38:13.336659 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="sg-core"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336675 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="sg-core"
Feb 25 07:38:13 crc kubenswrapper[4749]: E0225 07:38:13.336684 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-notification-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-notification-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: E0225 07:38:13.336704 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-central-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336710 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-central-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: E0225 07:38:13.336727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="proxy-httpd"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336733 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="proxy-httpd"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336903 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="proxy-httpd"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336917 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-central-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336930 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="sg-core"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.336937 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf2e978-07de-4a4d-9e7f-576500a7b8d4" containerName="ceilometer-notification-agent"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.338689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.344392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.345344 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.359724 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrc5\" (UniqueName: \"kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.427922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.530065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.531143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.531653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.531745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.531878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrc5\" (UniqueName: \"kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.531953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.532032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.532439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.532859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.535692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.536025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0"
Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.536491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") "
pod="openstack/ceilometer-0" Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.537849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0" Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.551979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrc5\" (UniqueName: \"kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5\") pod \"ceilometer-0\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " pod="openstack/ceilometer-0" Feb 25 07:38:13 crc kubenswrapper[4749]: I0225 07:38:13.730672 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:14 crc kubenswrapper[4749]: I0225 07:38:14.171773 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerID="e3018e6638b8d524d26767e09f5a4fdf417056f3b37a3e63abc914f28fb8ff5e" exitCode=143 Feb 25 07:38:14 crc kubenswrapper[4749]: I0225 07:38:14.171857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerDied","Data":"e3018e6638b8d524d26767e09f5a4fdf417056f3b37a3e63abc914f28fb8ff5e"} Feb 25 07:38:14 crc kubenswrapper[4749]: I0225 07:38:14.218031 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:14 crc kubenswrapper[4749]: I0225 07:38:14.972961 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6955447f7b-mp5f9" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: 
connect: connection refused" Feb 25 07:38:14 crc kubenswrapper[4749]: I0225 07:38:14.973677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.185626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerStarted","Data":"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb"} Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.185695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerStarted","Data":"d510ef39910af563fe1d0d39509c77ec8c7704098857de411802003316da1b1a"} Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.430268 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7jw8g"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.432291 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.444192 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7jw8g"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.514057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xbgmb"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.515562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.534927 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xbgmb"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.568793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfvv\" (UniqueName: \"kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.568886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.616985 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7d49-account-create-update-2q4zh"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.618443 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.622905 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7d49-account-create-update-2q4zh"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.624904 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.670160 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf4xh\" (UniqueName: \"kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.670219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfvv\" (UniqueName: \"kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.670268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.670311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " 
pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.671435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.685276 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6846d6d889-85shz" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.691039 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6846d6d889-85shz" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.704173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfvv\" (UniqueName: \"kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv\") pod \"nova-api-db-create-7jw8g\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.771557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf4xh\" (UniqueName: \"kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.771628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gms9p\" (UniqueName: \"kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " 
pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.771679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.771704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.772671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.792686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf4xh\" (UniqueName: \"kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh\") pod \"nova-cell0-db-create-xbgmb\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.831857 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7gjg9"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.832965 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.835917 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.842076 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c449-account-create-update-rc7pl"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.843254 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.844877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.851074 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.851199 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7gjg9"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.861555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c449-account-create-update-rc7pl"] Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.873124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.873333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gms9p\" (UniqueName: 
\"kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.875398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.892448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gms9p\" (UniqueName: \"kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p\") pod \"nova-api-7d49-account-create-update-2q4zh\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.942144 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.974258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg9cw\" (UniqueName: \"kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.974323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9fzb\" (UniqueName: \"kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.974379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:15 crc kubenswrapper[4749]: I0225 07:38:15.974463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:15 crc kubenswrapper[4749]: E0225 07:38:15.992437 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e7b8924_5f8a_4292_87d7_ddc30d845858.slice/crio-ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e7b8924_5f8a_4292_87d7_ddc30d845858.slice/crio-conmon-ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331.scope\": RecentStats: unable to find data in memory cache]" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.024008 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-08a2-account-create-update-ng7wt"] Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.025528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.027481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.038404 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-08a2-account-create-update-ng7wt"] Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.076051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.076159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " 
pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.076181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg9cw\" (UniqueName: \"kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.076221 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9fzb\" (UniqueName: \"kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.077088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.077527 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.097425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9fzb\" (UniqueName: \"kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb\") pod \"nova-cell0-c449-account-create-update-rc7pl\" (UID: 
\"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.100772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg9cw\" (UniqueName: \"kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw\") pod \"nova-cell1-db-create-7gjg9\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.179256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.179389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87rk\" (UniqueName: \"kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.220884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.227409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.245029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerStarted","Data":"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c"} Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.263704 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerID="ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331" exitCode=0 Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.264885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerDied","Data":"ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331"} Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.281766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87rk\" (UniqueName: \"kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.282167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.283098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.307514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87rk\" (UniqueName: \"kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk\") pod \"nova-cell1-08a2-account-create-update-ng7wt\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.324949 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.367515 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497658 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5z9r\" (UniqueName: 
\"kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.497955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run\") pod \"5e7b8924-5f8a-4292-87d7-ddc30d845858\" (UID: \"5e7b8924-5f8a-4292-87d7-ddc30d845858\") " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.498990 4749 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs" (OuterVolumeSpecName: "logs") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.509857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.519087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r" (OuterVolumeSpecName: "kube-api-access-m5z9r") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "kube-api-access-m5z9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.552775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts" (OuterVolumeSpecName: "scripts") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.567931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.601325 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.601374 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.601385 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5z9r\" (UniqueName: \"kubernetes.io/projected/5e7b8924-5f8a-4292-87d7-ddc30d845858-kube-api-access-m5z9r\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.601395 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e7b8924-5f8a-4292-87d7-ddc30d845858-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.601403 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.675407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7jw8g"] Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.679768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.693029 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.702654 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xbgmb"] Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.703837 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.703852 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.752555 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7d49-account-create-update-2q4zh"] Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.769371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.775691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data" (OuterVolumeSpecName: "config-data") pod "5e7b8924-5f8a-4292-87d7-ddc30d845858" (UID: "5e7b8924-5f8a-4292-87d7-ddc30d845858"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.806527 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:16 crc kubenswrapper[4749]: I0225 07:38:16.806559 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e7b8924-5f8a-4292-87d7-ddc30d845858-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.070434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7gjg9"] Feb 25 07:38:17 crc kubenswrapper[4749]: W0225 07:38:17.123566 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4b6bff5_789e_4d3f_bbe3_f5c52e4fe406.slice/crio-142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319 WatchSource:0}: Error finding container 142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319: Status 404 returned error can't find the container with id 142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319 Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.123933 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c449-account-create-update-rc7pl"] Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.213787 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-08a2-account-create-update-ng7wt"] Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.304447 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerID="104bf19674a4995a089c2ebf313c6aefcdc3053dd0c3dfa8937f4615466bcaeb" exitCode=0 Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.304515 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerDied","Data":"104bf19674a4995a089c2ebf313c6aefcdc3053dd0c3dfa8937f4615466bcaeb"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.336979 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.348753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7gjg9" event={"ID":"46bc241f-11af-4075-b898-060df752a179","Type":"ContainerStarted","Data":"fe2bda90df2d0745b8b46aef039eb990bc18fa0f303933d99ce5acdda64aa4a4"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.348821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7gjg9" event={"ID":"46bc241f-11af-4075-b898-060df752a179","Type":"ContainerStarted","Data":"31954dcde22916c3017324c9670b4f03b41f3c598dcc6caccb638ac69516d00f"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.348844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5e7b8924-5f8a-4292-87d7-ddc30d845858","Type":"ContainerDied","Data":"dc4d0acb478851e6d5351b9cc3294410c73e844b905ec09032978ffc93fe4a7a"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.348857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7d49-account-create-update-2q4zh" event={"ID":"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f","Type":"ContainerStarted","Data":"798752ffa2b19f79a2a814a2fec00dcc5ac69798af35f5feb24faac6367f319b"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.348866 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7d49-account-create-update-2q4zh" 
event={"ID":"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f","Type":"ContainerStarted","Data":"1e7074eaac38f9563d95342137187909105e2e77cee740420bdcaca4a7214aa2"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.349637 4749 scope.go:117] "RemoveContainer" containerID="ab32345e59e1802a9e1a0754307e48b0f07762dcbd2f974c2298495f5b74d331" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.356129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" event={"ID":"8184cd81-b73e-4815-8980-c117ecbddedb","Type":"ContainerStarted","Data":"cc62d71d14eaf98511d4e14c194a91edc6866122becf19a57c9afd74158119c0"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.371048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerStarted","Data":"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.384388 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-668b654645-4xlm5" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.387770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7jw8g" event={"ID":"2385f9b1-5956-4820-a03b-b9a7892c2e93","Type":"ContainerStarted","Data":"4a42adb1053b219dae8e3dca59d461530bd0a92fbf109abc5db7a14c1205307a"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.387789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7jw8g" event={"ID":"2385f9b1-5956-4820-a03b-b9a7892c2e93","Type":"ContainerStarted","Data":"250f05563a0613e57f39426b93c744d153d763588c1b6b28397ee61887d80474"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.401724 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.409057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xbgmb" event={"ID":"6538c07b-e233-452f-adeb-4a91300817de","Type":"ContainerStarted","Data":"77586455684cc5d95908b1a9e4ec84950605b8ef4a2bccee1009dca9afe33fb9"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.409103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xbgmb" event={"ID":"6538c07b-e233-452f-adeb-4a91300817de","Type":"ContainerStarted","Data":"5a0142eeaffc8b9c1606fdb915079f0687422f286e367f8b37cbf4ebd45fdb9a"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.411864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" event={"ID":"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406","Type":"ContainerStarted","Data":"142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319"} Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.519492 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-7d49-account-create-update-2q4zh" podStartSLOduration=2.519464761 podStartE2EDuration="2.519464761s" podCreationTimestamp="2026-02-25 07:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:17.511152337 +0000 UTC m=+1250.872978357" watchObservedRunningTime="2026-02-25 07:38:17.519464761 +0000 UTC m=+1250.881290781" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.521710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc 
kubenswrapper[4749]: I0225 07:38:17.521871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.521897 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.521938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.521968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.522023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5rnm\" (UniqueName: \"kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.522053 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: 
\"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.522072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data\") pod \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\" (UID: \"6f87a021-8e9a-42f4-9fd5-1779d4c39e36\") " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.522637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.523632 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs" (OuterVolumeSpecName: "logs") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.532654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.535908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm" (OuterVolumeSpecName: "kube-api-access-c5rnm") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "kube-api-access-c5rnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.551992 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts" (OuterVolumeSpecName: "scripts") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.588463 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-7gjg9" podStartSLOduration=2.588444717 podStartE2EDuration="2.588444717s" podCreationTimestamp="2026-02-25 07:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:17.575557747 +0000 UTC m=+1250.937383767" watchObservedRunningTime="2026-02-25 07:38:17.588444717 +0000 UTC m=+1250.950270737" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.601068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626358 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626414 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626426 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626441 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626451 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5rnm\" (UniqueName: \"kubernetes.io/projected/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-kube-api-access-c5rnm\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.626462 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.687883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data" (OuterVolumeSpecName: "config-data") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.690707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6f87a021-8e9a-42f4-9fd5-1779d4c39e36" (UID: "6f87a021-8e9a-42f4-9fd5-1779d4c39e36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.701016 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" podStartSLOduration=2.701001189 podStartE2EDuration="2.701001189s" podCreationTimestamp="2026-02-25 07:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:17.658497899 +0000 UTC m=+1251.020323919" watchObservedRunningTime="2026-02-25 07:38:17.701001189 +0000 UTC m=+1251.062827209" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.703893 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.726959 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7675646668-99wj9"] Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.727511 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7675646668-99wj9" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-httpd" containerID="cri-o://61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073" gracePeriod=30 Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.727757 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-7675646668-99wj9" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-api" containerID="cri-o://560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d" gracePeriod=30 Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.731081 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.731110 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.731120 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f87a021-8e9a-42f4-9fd5-1779d4c39e36-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.868092 4749 scope.go:117] "RemoveContainer" containerID="6ecdc80088d758d8064604d275b68591330c03c74e584f8523b3ef0bfb16f8eb" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.900496 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.935668 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.959437 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:38:17 crc kubenswrapper[4749]: E0225 07:38:17.959864 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.959878 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: E0225 07:38:17.959892 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.959898 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: E0225 07:38:17.959918 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.959924 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: E0225 07:38:17.959943 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.959949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.960114 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.960127 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.960142 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" containerName="glance-log" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.960154 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" containerName="glance-httpd" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.961124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.966010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.966155 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 25 07:38:17 crc kubenswrapper[4749]: I0225 07:38:17.970377 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fnrp\" (UniqueName: \"kubernetes.io/projected/64d42008-7546-4307-9953-37a51af1df8a-kube-api-access-9fnrp\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " 
pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.036354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " 
pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.111364 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.137818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.137871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.137896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.137963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.137983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-httpd-run\") pod \"glance-default-internal-api-0\" 
(UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.138004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.138037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fnrp\" (UniqueName: \"kubernetes.io/projected/64d42008-7546-4307-9953-37a51af1df8a-kube-api-access-9fnrp\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.138063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.141389 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.141468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") 
" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.141508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64d42008-7546-4307-9953-37a51af1df8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.143203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.143726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.145821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.147522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d42008-7546-4307-9953-37a51af1df8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.159784 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fnrp\" (UniqueName: \"kubernetes.io/projected/64d42008-7546-4307-9953-37a51af1df8a-kube-api-access-9fnrp\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.167184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"64d42008-7546-4307-9953-37a51af1df8a\") " pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.287405 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.431059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f87a021-8e9a-42f4-9fd5-1779d4c39e36","Type":"ContainerDied","Data":"a8fadc4888e188860b266e79522b2d80458c0417ffe4c4080d67ad7ebc843fee"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.431315 4749 scope.go:117] "RemoveContainer" containerID="104bf19674a4995a089c2ebf313c6aefcdc3053dd0c3dfa8937f4615466bcaeb" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.431092 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.436912 4749 generic.go:334] "Generic (PLEG): container finished" podID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerID="61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.436968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerDied","Data":"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.445531 4749 generic.go:334] "Generic (PLEG): container finished" podID="6538c07b-e233-452f-adeb-4a91300817de" containerID="77586455684cc5d95908b1a9e4ec84950605b8ef4a2bccee1009dca9afe33fb9" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.445639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xbgmb" event={"ID":"6538c07b-e233-452f-adeb-4a91300817de","Type":"ContainerDied","Data":"77586455684cc5d95908b1a9e4ec84950605b8ef4a2bccee1009dca9afe33fb9"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.453996 4749 generic.go:334] "Generic (PLEG): container finished" podID="b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" containerID="798752ffa2b19f79a2a814a2fec00dcc5ac69798af35f5feb24faac6367f319b" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.454054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7d49-account-create-update-2q4zh" event={"ID":"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f","Type":"ContainerDied","Data":"798752ffa2b19f79a2a814a2fec00dcc5ac69798af35f5feb24faac6367f319b"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.455630 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" 
containerID="1c8f76209c44fb02dd1aef9d22bd8fe14eaee0205b0649039d4ea8b01f13672f" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.455744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" event={"ID":"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406","Type":"ContainerDied","Data":"1c8f76209c44fb02dd1aef9d22bd8fe14eaee0205b0649039d4ea8b01f13672f"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.457832 4749 generic.go:334] "Generic (PLEG): container finished" podID="8184cd81-b73e-4815-8980-c117ecbddedb" containerID="c8b89ebe4cda7eaf8a75165f9d9a1deaa05de4b7e647ebb6c8c4c82d7dce31a9" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.457886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" event={"ID":"8184cd81-b73e-4815-8980-c117ecbddedb","Type":"ContainerDied","Data":"c8b89ebe4cda7eaf8a75165f9d9a1deaa05de4b7e647ebb6c8c4c82d7dce31a9"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.465131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerStarted","Data":"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.465311 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.466266 4749 generic.go:334] "Generic (PLEG): container finished" podID="46bc241f-11af-4075-b898-060df752a179" containerID="fe2bda90df2d0745b8b46aef039eb990bc18fa0f303933d99ce5acdda64aa4a4" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.466307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7gjg9" 
event={"ID":"46bc241f-11af-4075-b898-060df752a179","Type":"ContainerDied","Data":"fe2bda90df2d0745b8b46aef039eb990bc18fa0f303933d99ce5acdda64aa4a4"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.467621 4749 generic.go:334] "Generic (PLEG): container finished" podID="2385f9b1-5956-4820-a03b-b9a7892c2e93" containerID="4a42adb1053b219dae8e3dca59d461530bd0a92fbf109abc5db7a14c1205307a" exitCode=0 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.467665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7jw8g" event={"ID":"2385f9b1-5956-4820-a03b-b9a7892c2e93","Type":"ContainerDied","Data":"4a42adb1053b219dae8e3dca59d461530bd0a92fbf109abc5db7a14c1205307a"} Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.467830 4749 scope.go:117] "RemoveContainer" containerID="e3018e6638b8d524d26767e09f5a4fdf417056f3b37a3e63abc914f28fb8ff5e" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.513322 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.549076 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.564716 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.566033 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.570200 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.570203 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.589454 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.627352 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.999943786 podStartE2EDuration="5.627336735s" podCreationTimestamp="2026-02-25 07:38:13 +0000 UTC" firstStartedPulling="2026-02-25 07:38:14.21827473 +0000 UTC m=+1247.580100740" lastFinishedPulling="2026-02-25 07:38:17.845667669 +0000 UTC m=+1251.207493689" observedRunningTime="2026-02-25 07:38:18.587040026 +0000 UTC m=+1251.948866036" watchObservedRunningTime="2026-02-25 07:38:18.627336735 +0000 UTC m=+1251.989162755" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.646856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.646919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-logs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc 
kubenswrapper[4749]: I0225 07:38:18.646942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.646980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.647006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.647041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgnjm\" (UniqueName: \"kubernetes.io/projected/d3957039-d105-44aa-865d-08cf1bd562bf-kube-api-access-pgnjm\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.647081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " 
pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.647099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.749879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-logs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-logs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 
07:38:18.750386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3957039-d105-44aa-865d-08cf1bd562bf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgnjm\" (UniqueName: \"kubernetes.io/projected/d3957039-d105-44aa-865d-08cf1bd562bf-kube-api-access-pgnjm\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.750908 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.752310 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.769388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.770067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.771313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-config-data\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.772938 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d3957039-d105-44aa-865d-08cf1bd562bf-scripts\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.778275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgnjm\" (UniqueName: \"kubernetes.io/projected/d3957039-d105-44aa-865d-08cf1bd562bf-kube-api-access-pgnjm\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.822730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d3957039-d105-44aa-865d-08cf1bd562bf\") " pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.897377 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.931507 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 25 07:38:18 crc kubenswrapper[4749]: W0225 07:38:18.934846 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d42008_7546_4307_9953_37a51af1df8a.slice/crio-ecdb857f503edce43d9b356e8ba491f29b531b484df52e05fb467be874f5bb24 WatchSource:0}: Error finding container ecdb857f503edce43d9b356e8ba491f29b531b484df52e05fb467be874f5bb24: Status 404 returned error can't find the container with id ecdb857f503edce43d9b356e8ba491f29b531b484df52e05fb467be874f5bb24 Feb 25 07:38:18 crc kubenswrapper[4749]: I0225 07:38:18.935179 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.001809 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.065704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts\") pod \"2385f9b1-5956-4820-a03b-b9a7892c2e93\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.065783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts\") pod \"6538c07b-e233-452f-adeb-4a91300817de\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.065919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf4xh\" (UniqueName: \"kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh\") pod \"6538c07b-e233-452f-adeb-4a91300817de\" (UID: \"6538c07b-e233-452f-adeb-4a91300817de\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.065971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbfvv\" (UniqueName: \"kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv\") pod \"2385f9b1-5956-4820-a03b-b9a7892c2e93\" (UID: \"2385f9b1-5956-4820-a03b-b9a7892c2e93\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.069613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"6538c07b-e233-452f-adeb-4a91300817de" (UID: "6538c07b-e233-452f-adeb-4a91300817de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.069936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2385f9b1-5956-4820-a03b-b9a7892c2e93" (UID: "2385f9b1-5956-4820-a03b-b9a7892c2e93"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.071377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv" (OuterVolumeSpecName: "kube-api-access-sbfvv") pod "2385f9b1-5956-4820-a03b-b9a7892c2e93" (UID: "2385f9b1-5956-4820-a03b-b9a7892c2e93"). InnerVolumeSpecName "kube-api-access-sbfvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.071542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh" (OuterVolumeSpecName: "kube-api-access-bf4xh") pod "6538c07b-e233-452f-adeb-4a91300817de" (UID: "6538c07b-e233-452f-adeb-4a91300817de"). InnerVolumeSpecName "kube-api-access-bf4xh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.168936 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf4xh\" (UniqueName: \"kubernetes.io/projected/6538c07b-e233-452f-adeb-4a91300817de-kube-api-access-bf4xh\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.168980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbfvv\" (UniqueName: \"kubernetes.io/projected/2385f9b1-5956-4820-a03b-b9a7892c2e93-kube-api-access-sbfvv\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.168995 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2385f9b1-5956-4820-a03b-b9a7892c2e93-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.169007 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6538c07b-e233-452f-adeb-4a91300817de-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.347978 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e7b8924-5f8a-4292-87d7-ddc30d845858" path="/var/lib/kubelet/pods/5e7b8924-5f8a-4292-87d7-ddc30d845858/volumes" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.349026 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f87a021-8e9a-42f4-9fd5-1779d4c39e36" path="/var/lib/kubelet/pods/6f87a021-8e9a-42f4-9fd5-1779d4c39e36/volumes" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.518309 4749 generic.go:334] "Generic (PLEG): container finished" podID="54c92c50-880d-4bf0-823f-16f13d25066b" containerID="4d8d6ff5f005eda3d7bac28c9113d2df900871e30469621e74315136f4865155" exitCode=137 Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.518631 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerDied","Data":"4d8d6ff5f005eda3d7bac28c9113d2df900871e30469621e74315136f4865155"} Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.518654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6955447f7b-mp5f9" event={"ID":"54c92c50-880d-4bf0-823f-16f13d25066b","Type":"ContainerDied","Data":"f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35"} Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.518664 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7042f5114f040a0156c3fa0f6a7918c308ada0c7ff35617dd23fa5cee671a35" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.519636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7jw8g" event={"ID":"2385f9b1-5956-4820-a03b-b9a7892c2e93","Type":"ContainerDied","Data":"250f05563a0613e57f39426b93c744d153d763588c1b6b28397ee61887d80474"} Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.519759 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="250f05563a0613e57f39426b93c744d153d763588c1b6b28397ee61887d80474" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.519841 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7jw8g" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.521433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64d42008-7546-4307-9953-37a51af1df8a","Type":"ContainerStarted","Data":"ecdb857f503edce43d9b356e8ba491f29b531b484df52e05fb467be874f5bb24"} Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.530120 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xbgmb" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.530968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xbgmb" event={"ID":"6538c07b-e233-452f-adeb-4a91300817de","Type":"ContainerDied","Data":"5a0142eeaffc8b9c1606fdb915079f0687422f286e367f8b37cbf4ebd45fdb9a"} Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.531010 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0142eeaffc8b9c1606fdb915079f0687422f286e367f8b37cbf4ebd45fdb9a" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.531460 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-central-agent" containerID="cri-o://fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb" gracePeriod=30 Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.532050 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="proxy-httpd" containerID="cri-o://a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6" gracePeriod=30 Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.532111 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="sg-core" containerID="cri-o://37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6" gracePeriod=30 Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.532142 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-notification-agent" containerID="cri-o://2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c" gracePeriod=30 Feb 
25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.577061 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.674298 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692248 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692391 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgzvk\" (UniqueName: 
\"kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692560 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.692604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts\") pod \"54c92c50-880d-4bf0-823f-16f13d25066b\" (UID: \"54c92c50-880d-4bf0-823f-16f13d25066b\") " Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.703458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs" (OuterVolumeSpecName: "logs") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.725812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.726296 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts" (OuterVolumeSpecName: "scripts") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.732480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk" (OuterVolumeSpecName: "kube-api-access-tgzvk") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "kube-api-access-tgzvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.777565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data" (OuterVolumeSpecName: "config-data") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.794732 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgzvk\" (UniqueName: \"kubernetes.io/projected/54c92c50-880d-4bf0-823f-16f13d25066b-kube-api-access-tgzvk\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.794762 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54c92c50-880d-4bf0-823f-16f13d25066b-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.794772 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.794781 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.794789 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/54c92c50-880d-4bf0-823f-16f13d25066b-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.828970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.829066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "54c92c50-880d-4bf0-823f-16f13d25066b" (UID: "54c92c50-880d-4bf0-823f-16f13d25066b"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.897801 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:19 crc kubenswrapper[4749]: I0225 07:38:19.898029 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c92c50-880d-4bf0-823f-16f13d25066b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.036226 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.099939 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts\") pod \"8184cd81-b73e-4815-8980-c117ecbddedb\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.099996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d87rk\" (UniqueName: \"kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk\") pod \"8184cd81-b73e-4815-8980-c117ecbddedb\" (UID: \"8184cd81-b73e-4815-8980-c117ecbddedb\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.100717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8184cd81-b73e-4815-8980-c117ecbddedb" (UID: "8184cd81-b73e-4815-8980-c117ecbddedb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.105784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk" (OuterVolumeSpecName: "kube-api-access-d87rk") pod "8184cd81-b73e-4815-8980-c117ecbddedb" (UID: "8184cd81-b73e-4815-8980-c117ecbddedb"). InnerVolumeSpecName "kube-api-access-d87rk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.201977 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8184cd81-b73e-4815-8980-c117ecbddedb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.202027 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87rk\" (UniqueName: \"kubernetes.io/projected/8184cd81-b73e-4815-8980-c117ecbddedb-kube-api-access-d87rk\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.264125 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.276219 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.280562 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.304447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9fzb\" (UniqueName: \"kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb\") pod \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.304500 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts\") pod \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\" (UID: \"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.305051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" (UID: "c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.307694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb" (OuterVolumeSpecName: "kube-api-access-f9fzb") pod "c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" (UID: "c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406"). InnerVolumeSpecName "kube-api-access-f9fzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.405839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg9cw\" (UniqueName: \"kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw\") pod \"46bc241f-11af-4075-b898-060df752a179\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.406160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts\") pod \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.406220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts\") pod \"46bc241f-11af-4075-b898-060df752a179\" (UID: \"46bc241f-11af-4075-b898-060df752a179\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.406276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gms9p\" (UniqueName: \"kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p\") pod \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\" (UID: \"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f\") " Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.406731 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9fzb\" (UniqueName: \"kubernetes.io/projected/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-kube-api-access-f9fzb\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.406748 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.408948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46bc241f-11af-4075-b898-060df752a179" (UID: "46bc241f-11af-4075-b898-060df752a179"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.409073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p" (OuterVolumeSpecName: "kube-api-access-gms9p") pod "b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" (UID: "b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f"). InnerVolumeSpecName "kube-api-access-gms9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.409078 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" (UID: "b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.413658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw" (OuterVolumeSpecName: "kube-api-access-dg9cw") pod "46bc241f-11af-4075-b898-060df752a179" (UID: "46bc241f-11af-4075-b898-060df752a179"). InnerVolumeSpecName "kube-api-access-dg9cw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.508084 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gms9p\" (UniqueName: \"kubernetes.io/projected/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-kube-api-access-gms9p\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.508111 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg9cw\" (UniqueName: \"kubernetes.io/projected/46bc241f-11af-4075-b898-060df752a179-kube-api-access-dg9cw\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.508121 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.508130 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46bc241f-11af-4075-b898-060df752a179-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.540606 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.540875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c449-account-create-update-rc7pl" event={"ID":"c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406","Type":"ContainerDied","Data":"142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.540909 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142a95188a8910513113fa9f06b5e8c779fb560b3bd51da173ae12298ff9f319" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.549236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" event={"ID":"8184cd81-b73e-4815-8980-c117ecbddedb","Type":"ContainerDied","Data":"cc62d71d14eaf98511d4e14c194a91edc6866122becf19a57c9afd74158119c0"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.549277 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc62d71d14eaf98511d4e14c194a91edc6866122becf19a57c9afd74158119c0" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.549331 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-08a2-account-create-update-ng7wt" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.554469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3957039-d105-44aa-865d-08cf1bd562bf","Type":"ContainerStarted","Data":"cd0e2162214cd5da51eb15794eac0cd08789ebaaf5873e3417524e72f7bf61aa"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578145 4749 generic.go:334] "Generic (PLEG): container finished" podID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerID="a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6" exitCode=0 Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578178 4749 generic.go:334] "Generic (PLEG): container finished" podID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerID="37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6" exitCode=2 Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578186 4749 generic.go:334] "Generic (PLEG): container finished" podID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerID="2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c" exitCode=0 Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerDied","Data":"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerDied","Data":"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.578264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerDied","Data":"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.581020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7gjg9" event={"ID":"46bc241f-11af-4075-b898-060df752a179","Type":"ContainerDied","Data":"31954dcde22916c3017324c9670b4f03b41f3c598dcc6caccb638ac69516d00f"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.581043 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31954dcde22916c3017324c9670b4f03b41f3c598dcc6caccb638ac69516d00f" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.581086 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7gjg9" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.591716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64d42008-7546-4307-9953-37a51af1df8a","Type":"ContainerStarted","Data":"71698077fcadeaeb8e09437625ef99b8bcd1a1160afb4fffe752791c3d94c621"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.601460 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6955447f7b-mp5f9" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.602074 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7d49-account-create-update-2q4zh" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.606034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7d49-account-create-update-2q4zh" event={"ID":"b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f","Type":"ContainerDied","Data":"1e7074eaac38f9563d95342137187909105e2e77cee740420bdcaca4a7214aa2"} Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.606061 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7074eaac38f9563d95342137187909105e2e77cee740420bdcaca4a7214aa2" Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.719508 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:38:20 crc kubenswrapper[4749]: I0225 07:38:20.729318 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6955447f7b-mp5f9"] Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.344945 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" path="/var/lib/kubelet/pods/54c92c50-880d-4bf0-823f-16f13d25066b/volumes" Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.617512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"64d42008-7546-4307-9953-37a51af1df8a","Type":"ContainerStarted","Data":"9b2d8d39d25e13fe30f05320dc441a1b916ffec720f259045c5361efc4d192e9"} Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.622121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d3957039-d105-44aa-865d-08cf1bd562bf","Type":"ContainerStarted","Data":"33927bb74c53d6bf1b891ad3110d616f8a08161be90e3b9aa3dc768926ec8b16"} Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.622161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d3957039-d105-44aa-865d-08cf1bd562bf","Type":"ContainerStarted","Data":"eea9e46dcbf1d0b1833460e438d39762cf4fbc10d1d9bc92f5ca3edcf5798590"} Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.652406 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.652386224 podStartE2EDuration="4.652386224s" podCreationTimestamp="2026-02-25 07:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:21.636823522 +0000 UTC m=+1254.998649542" watchObservedRunningTime="2026-02-25 07:38:21.652386224 +0000 UTC m=+1255.014212244" Feb 25 07:38:21 crc kubenswrapper[4749]: I0225 07:38:21.662742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.662722264 podStartE2EDuration="3.662722264s" podCreationTimestamp="2026-02-25 07:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:21.655507567 +0000 UTC m=+1255.017333587" watchObservedRunningTime="2026-02-25 07:38:21.662722264 +0000 UTC m=+1255.024548284" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.241150 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7675646668-99wj9" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.372852 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config\") pod \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.373010 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs\") pod \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.373048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config\") pod \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.373062 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle\") pod \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.373133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg4c8\" (UniqueName: \"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8\") pod \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\" (UID: \"d762ab38-78a5-4da4-a1a1-1831e6e069d3\") " Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.378292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8" (OuterVolumeSpecName: "kube-api-access-lg4c8") pod "d762ab38-78a5-4da4-a1a1-1831e6e069d3" (UID: "d762ab38-78a5-4da4-a1a1-1831e6e069d3"). InnerVolumeSpecName "kube-api-access-lg4c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.379072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d762ab38-78a5-4da4-a1a1-1831e6e069d3" (UID: "d762ab38-78a5-4da4-a1a1-1831e6e069d3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.428865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config" (OuterVolumeSpecName: "config") pod "d762ab38-78a5-4da4-a1a1-1831e6e069d3" (UID: "d762ab38-78a5-4da4-a1a1-1831e6e069d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.429468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d762ab38-78a5-4da4-a1a1-1831e6e069d3" (UID: "d762ab38-78a5-4da4-a1a1-1831e6e069d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.478649 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.478689 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.478703 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.478717 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg4c8\" (UniqueName: \"kubernetes.io/projected/d762ab38-78a5-4da4-a1a1-1831e6e069d3-kube-api-access-lg4c8\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.479214 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d762ab38-78a5-4da4-a1a1-1831e6e069d3" (UID: "d762ab38-78a5-4da4-a1a1-1831e6e069d3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.581014 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d762ab38-78a5-4da4-a1a1-1831e6e069d3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.641979 4749 generic.go:334] "Generic (PLEG): container finished" podID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerID="560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d" exitCode=0 Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.642023 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7675646668-99wj9" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.642028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerDied","Data":"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d"} Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.642162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7675646668-99wj9" event={"ID":"d762ab38-78a5-4da4-a1a1-1831e6e069d3","Type":"ContainerDied","Data":"f12adb14f9b67f8afce1fce27bc74052ed1fc254471765584823c02e29bf297d"} Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.642188 4749 scope.go:117] "RemoveContainer" containerID="61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.700112 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7675646668-99wj9"] Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.708867 4749 scope.go:117] "RemoveContainer" containerID="560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.708931 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-7675646668-99wj9"] Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.735925 4749 scope.go:117] "RemoveContainer" containerID="61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073" Feb 25 07:38:23 crc kubenswrapper[4749]: E0225 07:38:23.736403 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073\": container with ID starting with 61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073 not found: ID does not exist" containerID="61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.736447 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073"} err="failed to get container status \"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073\": rpc error: code = NotFound desc = could not find container \"61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073\": container with ID starting with 61d05b73ea4fe1a7fb2806f3a5dd0b495d4c46521d73648795f50f9f295a0073 not found: ID does not exist" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 07:38:23.736477 4749 scope.go:117] "RemoveContainer" containerID="560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d" Feb 25 07:38:23 crc kubenswrapper[4749]: E0225 07:38:23.736871 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d\": container with ID starting with 560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d not found: ID does not exist" containerID="560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d" Feb 25 07:38:23 crc kubenswrapper[4749]: I0225 
07:38:23.736929 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d"} err="failed to get container status \"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d\": rpc error: code = NotFound desc = could not find container \"560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d\": container with ID starting with 560fe43031663e0a20295a3e42cc048a336e97ce95122662000c9e78e7b2239d not found: ID does not exist" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.332634 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" path="/var/lib/kubelet/pods/d762ab38-78a5-4da4-a1a1-1831e6e069d3/volumes" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.360977 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411571 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: 
\"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrc5\" (UniqueName: \"kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.411999 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd\") pod \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\" (UID: \"f571c29f-2f4e-4c47-affe-33b84e1cc8fe\") " Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.412643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.412850 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.417162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts" (OuterVolumeSpecName: "scripts") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.418638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5" (OuterVolumeSpecName: "kube-api-access-gjrc5") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "kube-api-access-gjrc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.467819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.504726 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514782 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514831 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514846 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrc5\" (UniqueName: \"kubernetes.io/projected/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-kube-api-access-gjrc5\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514861 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514872 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.514883 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.531771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data" (OuterVolumeSpecName: "config-data") pod "f571c29f-2f4e-4c47-affe-33b84e1cc8fe" (UID: "f571c29f-2f4e-4c47-affe-33b84e1cc8fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.617825 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f571c29f-2f4e-4c47-affe-33b84e1cc8fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.693233 4749 generic.go:334] "Generic (PLEG): container finished" podID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerID="fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb" exitCode=0 Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.693284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerDied","Data":"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb"} Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.693314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f571c29f-2f4e-4c47-affe-33b84e1cc8fe","Type":"ContainerDied","Data":"d510ef39910af563fe1d0d39509c77ec8c7704098857de411802003316da1b1a"} Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.693312 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.693335 4749 scope.go:117] "RemoveContainer" containerID="a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.735048 4749 scope.go:117] "RemoveContainer" containerID="37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.750172 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.767777 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775306 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775375 4749 scope.go:117] "RemoveContainer" containerID="2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775766 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775791 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775808 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775817 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775836 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="proxy-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="proxy-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775859 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-notification-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775868 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-notification-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775886 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-api" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775896 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-api" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-central-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775917 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-central-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775929 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6538c07b-e233-452f-adeb-4a91300817de" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775937 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6538c07b-e233-452f-adeb-4a91300817de" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775951 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.775959 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.775984 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bc241f-11af-4075-b898-060df752a179" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776003 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bc241f-11af-4075-b898-060df752a179" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.776013 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8184cd81-b73e-4815-8980-c117ecbddedb" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776021 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8184cd81-b73e-4815-8980-c117ecbddedb" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.776035 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="sg-core" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776044 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="sg-core" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.776055 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon-log" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776063 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon-log" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.776077 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.776100 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2385f9b1-5956-4820-a03b-b9a7892c2e93" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776108 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2385f9b1-5956-4820-a03b-b9a7892c2e93" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776302 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-notification-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776319 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8184cd81-b73e-4815-8980-c117ecbddedb" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776340 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776362 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bc241f-11af-4075-b898-060df752a179" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776376 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-api" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776391 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" 
containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776403 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="proxy-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776417 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6538c07b-e233-452f-adeb-4a91300817de" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776432 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="sg-core" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776441 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" containerName="mariadb-account-create-update" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776456 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d762ab38-78a5-4da4-a1a1-1831e6e069d3" containerName="neutron-httpd" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776464 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" containerName="ceilometer-central-agent" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776477 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2385f9b1-5956-4820-a03b-b9a7892c2e93" containerName="mariadb-database-create" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.776491 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c92c50-880d-4bf0-823f-16f13d25066b" containerName="horizon-log" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.806390 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.806491 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.810639 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.811046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.820879 4749 scope.go:117] "RemoveContainer" containerID="fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.849212 4749 scope.go:117] "RemoveContainer" containerID="a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.849637 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6\": container with ID starting with a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6 not found: ID does not exist" containerID="a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.849671 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6"} err="failed to get container status \"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6\": rpc error: code = NotFound desc = could not find container \"a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6\": container with ID starting with a27d4735117045972520874f15fae9841164ae1153dec53d32af1dcb80a8b9d6 not found: ID does not exist" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.849693 4749 scope.go:117] "RemoveContainer" containerID="37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6" Feb 25 07:38:25 crc 
kubenswrapper[4749]: E0225 07:38:25.849964 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6\": container with ID starting with 37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6 not found: ID does not exist" containerID="37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.850000 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6"} err="failed to get container status \"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6\": rpc error: code = NotFound desc = could not find container \"37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6\": container with ID starting with 37e32c60bdd95e4f72734f8352a1d5dbef4e3dcf1a778fa878553dfe4555f7f6 not found: ID does not exist" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.850026 4749 scope.go:117] "RemoveContainer" containerID="2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.850871 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c\": container with ID starting with 2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c not found: ID does not exist" containerID="2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.850916 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c"} err="failed to get container status 
\"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c\": rpc error: code = NotFound desc = could not find container \"2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c\": container with ID starting with 2ddccae15fafa96a13146d4bb2820540618cfbdb469f372da8212fe951af6c9c not found: ID does not exist" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.850942 4749 scope.go:117] "RemoveContainer" containerID="fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb" Feb 25 07:38:25 crc kubenswrapper[4749]: E0225 07:38:25.851257 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb\": container with ID starting with fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb not found: ID does not exist" containerID="fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.851279 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb"} err="failed to get container status \"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb\": rpc error: code = NotFound desc = could not find container \"fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb\": container with ID starting with fd8ec682ba814e38d504ecb19aaed0f9220a2938d9592fc407dd664d549d43fb not found: ID does not exist" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.923603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 
07:38:25.923964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565rd\" (UniqueName: \"kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.923996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.924049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.924098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.924340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:25 crc kubenswrapper[4749]: I0225 07:38:25.924525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027134 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027309 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565rd\" (UniqueName: \"kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.027972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.028031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.028384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.033277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.033301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " 
pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.036796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.037303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.067855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565rd\" (UniqueName: \"kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd\") pod \"ceilometer-0\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.123796 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mlb5c"] Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.124808 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.127150 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7hxrs" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.127544 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.127763 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.128604 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.159444 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mlb5c"] Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.230826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.230919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tcb\" (UniqueName: \"kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.230966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.230987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.332852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.333222 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tcb\" (UniqueName: \"kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.333244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.333264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.336910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.337910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.338364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.351876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tcb\" (UniqueName: \"kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb\") pod \"nova-cell0-conductor-db-sync-mlb5c\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.552524 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.655311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:26 crc kubenswrapper[4749]: W0225 07:38:26.660101 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68241ef1_ed7d_482a_b88f_e9a8d15696d1.slice/crio-f53c9103b1539697d2e5e153e4adf7eb65174e2f2bb07786ea078306ccdba0ad WatchSource:0}: Error finding container f53c9103b1539697d2e5e153e4adf7eb65174e2f2bb07786ea078306ccdba0ad: Status 404 returned error can't find the container with id f53c9103b1539697d2e5e153e4adf7eb65174e2f2bb07786ea078306ccdba0ad Feb 25 07:38:26 crc kubenswrapper[4749]: I0225 07:38:26.755935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerStarted","Data":"f53c9103b1539697d2e5e153e4adf7eb65174e2f2bb07786ea078306ccdba0ad"} Feb 25 07:38:27 crc kubenswrapper[4749]: I0225 07:38:27.070178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mlb5c"] Feb 25 07:38:27 crc kubenswrapper[4749]: I0225 07:38:27.336225 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f571c29f-2f4e-4c47-affe-33b84e1cc8fe" path="/var/lib/kubelet/pods/f571c29f-2f4e-4c47-affe-33b84e1cc8fe/volumes" Feb 25 07:38:27 crc kubenswrapper[4749]: I0225 07:38:27.783925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerStarted","Data":"8f310ef1c8cc90835784623c952fde0e2e24343df163bc6ebe14b760196878cf"} Feb 25 07:38:27 crc kubenswrapper[4749]: I0225 07:38:27.786411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" 
event={"ID":"3f52cb2d-084a-4cf4-95c9-facab10be752","Type":"ContainerStarted","Data":"d00a01d653bdfe16402bd4f11b9fe66accca7f1b5d4f36037a446aed45147ebe"} Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.288650 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.288923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.335480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.336899 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.796473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerStarted","Data":"aa681d9960d27ab1858989a1a8328308b64f0d77b163cbbe65b9bdac5675136d"} Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.797035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerStarted","Data":"3fb3df8a710cb2b36dc3c91e5e0e715c72185b8b9e8a04e94df3c599b86b2827"} Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.797062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.797101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.898052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-external-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.898088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.934129 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 07:38:28 crc kubenswrapper[4749]: I0225 07:38:28.942738 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 25 07:38:29 crc kubenswrapper[4749]: I0225 07:38:29.805234 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 07:38:29 crc kubenswrapper[4749]: I0225 07:38:29.805276 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 25 07:38:30 crc kubenswrapper[4749]: I0225 07:38:30.611016 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:30 crc kubenswrapper[4749]: I0225 07:38:30.629761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 25 07:38:31 crc kubenswrapper[4749]: I0225 07:38:31.926726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 07:38:31 crc kubenswrapper[4749]: I0225 07:38:31.927053 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 25 07:38:32 crc kubenswrapper[4749]: I0225 07:38:32.004046 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 25 07:38:36 crc kubenswrapper[4749]: I0225 07:38:36.907233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerStarted","Data":"3a9d045f547bc2ad851f7881feea1a48bf031b6789971c36947c49db23a00ca4"} Feb 25 07:38:36 crc kubenswrapper[4749]: I0225 07:38:36.907771 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:38:36 crc kubenswrapper[4749]: I0225 07:38:36.911205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" event={"ID":"3f52cb2d-084a-4cf4-95c9-facab10be752","Type":"ContainerStarted","Data":"2a450491912714d76d5fadba04571a48c41b9b9a27ae123cc77f3021d6fafad3"} Feb 25 07:38:36 crc kubenswrapper[4749]: I0225 07:38:36.930390 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.531141382 podStartE2EDuration="11.930366846s" podCreationTimestamp="2026-02-25 07:38:25 +0000 UTC" firstStartedPulling="2026-02-25 07:38:26.668051688 +0000 UTC m=+1260.029877708" lastFinishedPulling="2026-02-25 07:38:36.067277152 +0000 UTC m=+1269.429103172" observedRunningTime="2026-02-25 07:38:36.927333104 +0000 UTC m=+1270.289159124" watchObservedRunningTime="2026-02-25 07:38:36.930366846 +0000 UTC m=+1270.292192906" Feb 25 07:38:36 crc kubenswrapper[4749]: I0225 07:38:36.951434 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" podStartSLOduration=1.960068892 podStartE2EDuration="10.951417036s" podCreationTimestamp="2026-02-25 07:38:26 +0000 UTC" firstStartedPulling="2026-02-25 07:38:27.080387212 +0000 UTC m=+1260.442213242" lastFinishedPulling="2026-02-25 07:38:36.071735366 +0000 UTC m=+1269.433561386" observedRunningTime="2026-02-25 07:38:36.939211461 +0000 UTC m=+1270.301037481" watchObservedRunningTime="2026-02-25 07:38:36.951417036 +0000 UTC m=+1270.313243056" Feb 25 07:38:38 crc kubenswrapper[4749]: I0225 07:38:38.428107 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 25 07:38:38 crc kubenswrapper[4749]: I0225 07:38:38.933158 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-central-agent" containerID="cri-o://8f310ef1c8cc90835784623c952fde0e2e24343df163bc6ebe14b760196878cf" gracePeriod=30 Feb 25 07:38:38 crc kubenswrapper[4749]: I0225 07:38:38.933236 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-notification-agent" containerID="cri-o://3fb3df8a710cb2b36dc3c91e5e0e715c72185b8b9e8a04e94df3c599b86b2827" gracePeriod=30 Feb 25 07:38:38 crc kubenswrapper[4749]: I0225 07:38:38.933255 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="proxy-httpd" containerID="cri-o://3a9d045f547bc2ad851f7881feea1a48bf031b6789971c36947c49db23a00ca4" gracePeriod=30 Feb 25 07:38:38 crc kubenswrapper[4749]: I0225 07:38:38.933255 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="sg-core" containerID="cri-o://aa681d9960d27ab1858989a1a8328308b64f0d77b163cbbe65b9bdac5675136d" gracePeriod=30 Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948099 4749 generic.go:334] "Generic (PLEG): container finished" podID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerID="3a9d045f547bc2ad851f7881feea1a48bf031b6789971c36947c49db23a00ca4" exitCode=0 Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948471 4749 generic.go:334] "Generic (PLEG): container finished" podID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerID="aa681d9960d27ab1858989a1a8328308b64f0d77b163cbbe65b9bdac5675136d" exitCode=2 Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 
07:38:39.948483 4749 generic.go:334] "Generic (PLEG): container finished" podID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerID="3fb3df8a710cb2b36dc3c91e5e0e715c72185b8b9e8a04e94df3c599b86b2827" exitCode=0 Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948495 4749 generic.go:334] "Generic (PLEG): container finished" podID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerID="8f310ef1c8cc90835784623c952fde0e2e24343df163bc6ebe14b760196878cf" exitCode=0 Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerDied","Data":"3a9d045f547bc2ad851f7881feea1a48bf031b6789971c36947c49db23a00ca4"} Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerDied","Data":"aa681d9960d27ab1858989a1a8328308b64f0d77b163cbbe65b9bdac5675136d"} Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerDied","Data":"3fb3df8a710cb2b36dc3c91e5e0e715c72185b8b9e8a04e94df3c599b86b2827"} Feb 25 07:38:39 crc kubenswrapper[4749]: I0225 07:38:39.948624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerDied","Data":"8f310ef1c8cc90835784623c952fde0e2e24343df163bc6ebe14b760196878cf"} Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.117647 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.215350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.215771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.216043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565rd\" (UniqueName: \"kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.216886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.216937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.216969 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.216995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.217156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd\") pod \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\" (UID: \"68241ef1-ed7d-482a-b88f-e9a8d15696d1\") " Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.217604 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.217949 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.222392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts" (OuterVolumeSpecName: "scripts") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.253374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.253465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd" (OuterVolumeSpecName: "kube-api-access-565rd") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "kube-api-access-565rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.319174 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68241ef1-ed7d-482a-b88f-e9a8d15696d1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.319221 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565rd\" (UniqueName: \"kubernetes.io/projected/68241ef1-ed7d-482a-b88f-e9a8d15696d1-kube-api-access-565rd\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.319237 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.319248 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-scripts\") on node 
\"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.329864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data" (OuterVolumeSpecName: "config-data") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.331826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68241ef1-ed7d-482a-b88f-e9a8d15696d1" (UID: "68241ef1-ed7d-482a-b88f-e9a8d15696d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.421083 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.421215 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68241ef1-ed7d-482a-b88f-e9a8d15696d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.439205 4749 scope.go:117] "RemoveContainer" containerID="468e4c7106d078844a2ce0ddaa61b3252f80439b2616458d61bcdfdc0fcae9dc" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.968725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68241ef1-ed7d-482a-b88f-e9a8d15696d1","Type":"ContainerDied","Data":"f53c9103b1539697d2e5e153e4adf7eb65174e2f2bb07786ea078306ccdba0ad"} Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.968902 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:40 crc kubenswrapper[4749]: I0225 07:38:40.969181 4749 scope.go:117] "RemoveContainer" containerID="3a9d045f547bc2ad851f7881feea1a48bf031b6789971c36947c49db23a00ca4" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.012326 4749 scope.go:117] "RemoveContainer" containerID="aa681d9960d27ab1858989a1a8328308b64f0d77b163cbbe65b9bdac5675136d" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.023638 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.034230 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.042093 4749 scope.go:117] "RemoveContainer" containerID="3fb3df8a710cb2b36dc3c91e5e0e715c72185b8b9e8a04e94df3c599b86b2827" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.056913 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:41 crc kubenswrapper[4749]: E0225 07:38:41.057309 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-notification-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057332 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-notification-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: E0225 07:38:41.057356 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-central-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057364 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-central-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: E0225 07:38:41.057393 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="sg-core" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057401 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="sg-core" Feb 25 07:38:41 crc kubenswrapper[4749]: E0225 07:38:41.057416 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="proxy-httpd" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057423 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="proxy-httpd" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057665 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="sg-core" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057682 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="proxy-httpd" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057697 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-notification-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.057725 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" containerName="ceilometer-central-agent" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.059621 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.064930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.065257 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.078365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145747 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln6dw\" (UniqueName: \"kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " 
pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.145967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln6dw\" (UniqueName: \"kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.247501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.248004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: 
I0225 07:38:41.248236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.251657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.252412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.253477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.262030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.263519 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln6dw\" (UniqueName: \"kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw\") pod \"ceilometer-0\" (UID: 
\"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.331718 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68241ef1-ed7d-482a-b88f-e9a8d15696d1" path="/var/lib/kubelet/pods/68241ef1-ed7d-482a-b88f-e9a8d15696d1/volumes" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.445877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.559138 4749 scope.go:117] "RemoveContainer" containerID="8f310ef1c8cc90835784623c952fde0e2e24343df163bc6ebe14b760196878cf" Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.875178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:38:41 crc kubenswrapper[4749]: W0225 07:38:41.884281 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf55fe179_96dd_4ed7_8a62_5e3ca048200f.slice/crio-2f4e6b885255df45ade2772b416af5a398a52ee497540d58713501231605a449 WatchSource:0}: Error finding container 2f4e6b885255df45ade2772b416af5a398a52ee497540d58713501231605a449: Status 404 returned error can't find the container with id 2f4e6b885255df45ade2772b416af5a398a52ee497540d58713501231605a449 Feb 25 07:38:41 crc kubenswrapper[4749]: I0225 07:38:41.980931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerStarted","Data":"2f4e6b885255df45ade2772b416af5a398a52ee497540d58713501231605a449"} Feb 25 07:38:44 crc kubenswrapper[4749]: I0225 07:38:44.018819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerStarted","Data":"b8221002f7ca405abe79eb80840e7b865672e8a54af9d0f084955e497e58bccc"} Feb 25 07:38:44 crc kubenswrapper[4749]: 
I0225 07:38:44.019621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerStarted","Data":"576a35bf4d501a7bd5a5abcead4e116551289c5ebc27021fb0bea3ee9d50dd57"} Feb 25 07:38:45 crc kubenswrapper[4749]: I0225 07:38:45.032353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerStarted","Data":"004b9bad1ef9bdb43eda8a4e624740d0c1a80e06e69bf9c7a90fd7c8d9a87b9d"} Feb 25 07:38:46 crc kubenswrapper[4749]: I0225 07:38:46.049318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerStarted","Data":"ad78b9fabf8821ac2dcf29ade5240ce787b153daf923955933c705ddbaec3df0"} Feb 25 07:38:46 crc kubenswrapper[4749]: I0225 07:38:46.049860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:38:47 crc kubenswrapper[4749]: I0225 07:38:47.846021 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5e7b8924-5f8a-4292-87d7-ddc30d845858"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5e7b8924-5f8a-4292-87d7-ddc30d845858] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5e7b8924_5f8a_4292_87d7_ddc30d845858.slice" Feb 25 07:38:48 crc kubenswrapper[4749]: I0225 07:38:48.071948 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f52cb2d-084a-4cf4-95c9-facab10be752" containerID="2a450491912714d76d5fadba04571a48c41b9b9a27ae123cc77f3021d6fafad3" exitCode=0 Feb 25 07:38:48 crc kubenswrapper[4749]: I0225 07:38:48.071993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" 
event={"ID":"3f52cb2d-084a-4cf4-95c9-facab10be752","Type":"ContainerDied","Data":"2a450491912714d76d5fadba04571a48c41b9b9a27ae123cc77f3021d6fafad3"} Feb 25 07:38:48 crc kubenswrapper[4749]: I0225 07:38:48.097431 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.179944291 podStartE2EDuration="7.097408175s" podCreationTimestamp="2026-02-25 07:38:41 +0000 UTC" firstStartedPulling="2026-02-25 07:38:41.886966803 +0000 UTC m=+1275.248792823" lastFinishedPulling="2026-02-25 07:38:45.804430647 +0000 UTC m=+1279.166256707" observedRunningTime="2026-02-25 07:38:46.069890401 +0000 UTC m=+1279.431716461" watchObservedRunningTime="2026-02-25 07:38:48.097408175 +0000 UTC m=+1281.459234235" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.470702 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.603043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle\") pod \"3f52cb2d-084a-4cf4-95c9-facab10be752\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.603091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts\") pod \"3f52cb2d-084a-4cf4-95c9-facab10be752\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.603161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9tcb\" (UniqueName: \"kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb\") pod \"3f52cb2d-084a-4cf4-95c9-facab10be752\" (UID: 
\"3f52cb2d-084a-4cf4-95c9-facab10be752\") " Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.603222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data\") pod \"3f52cb2d-084a-4cf4-95c9-facab10be752\" (UID: \"3f52cb2d-084a-4cf4-95c9-facab10be752\") " Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.609805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb" (OuterVolumeSpecName: "kube-api-access-g9tcb") pod "3f52cb2d-084a-4cf4-95c9-facab10be752" (UID: "3f52cb2d-084a-4cf4-95c9-facab10be752"). InnerVolumeSpecName "kube-api-access-g9tcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.622926 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts" (OuterVolumeSpecName: "scripts") pod "3f52cb2d-084a-4cf4-95c9-facab10be752" (UID: "3f52cb2d-084a-4cf4-95c9-facab10be752"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.643648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f52cb2d-084a-4cf4-95c9-facab10be752" (UID: "3f52cb2d-084a-4cf4-95c9-facab10be752"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.662451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data" (OuterVolumeSpecName: "config-data") pod "3f52cb2d-084a-4cf4-95c9-facab10be752" (UID: "3f52cb2d-084a-4cf4-95c9-facab10be752"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.705673 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.705717 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.705732 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f52cb2d-084a-4cf4-95c9-facab10be752-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:49 crc kubenswrapper[4749]: I0225 07:38:49.705746 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9tcb\" (UniqueName: \"kubernetes.io/projected/3f52cb2d-084a-4cf4-95c9-facab10be752-kube-api-access-g9tcb\") on node \"crc\" DevicePath \"\"" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.095460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" event={"ID":"3f52cb2d-084a-4cf4-95c9-facab10be752","Type":"ContainerDied","Data":"d00a01d653bdfe16402bd4f11b9fe66accca7f1b5d4f36037a446aed45147ebe"} Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.095512 4749 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="d00a01d653bdfe16402bd4f11b9fe66accca7f1b5d4f36037a446aed45147ebe" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.095582 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mlb5c" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.275697 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 07:38:50 crc kubenswrapper[4749]: E0225 07:38:50.276209 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f52cb2d-084a-4cf4-95c9-facab10be752" containerName="nova-cell0-conductor-db-sync" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.276243 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f52cb2d-084a-4cf4-95c9-facab10be752" containerName="nova-cell0-conductor-db-sync" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.276585 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f52cb2d-084a-4cf4-95c9-facab10be752" containerName="nova-cell0-conductor-db-sync" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.277429 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.281366 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.281773 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7hxrs" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.307689 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.421856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvnx\" (UniqueName: \"kubernetes.io/projected/1bc34300-d817-4b83-865e-2cc2c5ffb31a-kube-api-access-5jvnx\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.422019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.422140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.524266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvnx\" (UniqueName: 
\"kubernetes.io/projected/1bc34300-d817-4b83-865e-2cc2c5ffb31a-kube-api-access-5jvnx\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.524351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.524404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.529323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.542251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc34300-d817-4b83-865e-2cc2c5ffb31a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.547255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvnx\" (UniqueName: \"kubernetes.io/projected/1bc34300-d817-4b83-865e-2cc2c5ffb31a-kube-api-access-5jvnx\") pod \"nova-cell0-conductor-0\" (UID: 
\"1bc34300-d817-4b83-865e-2cc2c5ffb31a\") " pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:50 crc kubenswrapper[4749]: I0225 07:38:50.614981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:51 crc kubenswrapper[4749]: I0225 07:38:51.106647 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 25 07:38:51 crc kubenswrapper[4749]: W0225 07:38:51.122538 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc34300_d817_4b83_865e_2cc2c5ffb31a.slice/crio-8991981877ea60ba4d2da3ce86a3743f38b3e1eba5deca3b2abb46f712db2049 WatchSource:0}: Error finding container 8991981877ea60ba4d2da3ce86a3743f38b3e1eba5deca3b2abb46f712db2049: Status 404 returned error can't find the container with id 8991981877ea60ba4d2da3ce86a3743f38b3e1eba5deca3b2abb46f712db2049 Feb 25 07:38:52 crc kubenswrapper[4749]: I0225 07:38:52.120931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc34300-d817-4b83-865e-2cc2c5ffb31a","Type":"ContainerStarted","Data":"889fb5a71e63317e55bf3f6caf4f6d6831e83b8f3c28eb08f751ad05d30fcc6f"} Feb 25 07:38:52 crc kubenswrapper[4749]: I0225 07:38:52.121327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1bc34300-d817-4b83-865e-2cc2c5ffb31a","Type":"ContainerStarted","Data":"8991981877ea60ba4d2da3ce86a3743f38b3e1eba5deca3b2abb46f712db2049"} Feb 25 07:38:52 crc kubenswrapper[4749]: I0225 07:38:52.121357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 25 07:38:52 crc kubenswrapper[4749]: I0225 07:38:52.150998 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.15097233 podStartE2EDuration="2.15097233s" 
podCreationTimestamp="2026-02-25 07:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:38:52.136998815 +0000 UTC m=+1285.498824835" watchObservedRunningTime="2026-02-25 07:38:52.15097233 +0000 UTC m=+1285.512798380" Feb 25 07:39:00 crc kubenswrapper[4749]: I0225 07:39:00.663393 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.248758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-n7jcl"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.253012 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.255840 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.255880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.270144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-n7jcl"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.372478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.372608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjmn\" (UniqueName: 
\"kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.372669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.372741 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.472351 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.473587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.473751 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.473753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.478535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.479726 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.479918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.499338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.507030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjmn\" (UniqueName: 
\"kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.514244 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.518406 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.542107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjmn\" (UniqueName: \"kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn\") pod \"nova-cell0-cell-mapping-n7jcl\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") " pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.590946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n7jcl" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.592604 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.599211 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.602859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.608963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.608996 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64s4\" (UniqueName: \"kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.609092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.609116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.627471 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.628705 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.638056 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.654941 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.657193 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.685513 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.687497 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64s4\" (UniqueName: \"kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r59f\" (UniqueName: \"kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 
07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vckl\" (UniqueName: \"kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-przbw\" 
(UniqueName: \"kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714410 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.714445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.715011 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.721283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.726374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.739924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64s4\" (UniqueName: \"kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4\") pod \"nova-metadata-0\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") " pod="openstack/nova-metadata-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.762136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.795689 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.796930 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.803188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r59f\" (UniqueName: \"kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vckl\" (UniqueName: \"kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.815987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.816006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-przbw\" (UniqueName: \"kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 
07:39:01.816024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.816043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.816059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.817001 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.821124 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.823826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 
07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.824166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.824202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.824760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.826163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.829720 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.831303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.833855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.835161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.843724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vckl\" (UniqueName: \"kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl\") pod \"dnsmasq-dns-bccf8f775-d5nx6\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") " pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.844154 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r59f\" (UniqueName: \"kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f\") pod \"nova-scheduler-0\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.845718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-przbw\" (UniqueName: \"kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw\") pod \"nova-api-0\" (UID: 
\"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " pod="openstack/nova-api-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.918248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.918584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l98rj\" (UniqueName: \"kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:01 crc kubenswrapper[4749]: I0225 07:39:01.918631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.019653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l98rj\" (UniqueName: \"kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.019717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.019813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.029207 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.036530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.048921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.059234 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.064473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.064776 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.068002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l98rj\" (UniqueName: \"kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.122232 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.230060 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-n7jcl"] Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.354434 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xxh4"] Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.357470 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.361355 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.361514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.368296 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xxh4"] Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.436765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.438503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.438819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.438928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vzm4g\" (UniqueName: \"kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.541699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.541763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzm4g\" (UniqueName: \"kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.541808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.541909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.553144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.555012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.557095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzm4g\" (UniqueName: \"kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.558131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xxh4\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.602444 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:02 crc kubenswrapper[4749]: W0225 07:39:02.604397 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e64b9e0_c5c2_4b11_a6dc_72fffdc15263.slice/crio-a0576409009276195aa64204901c98faca57d82290117030bc5ad938802e25d5 WatchSource:0}: Error finding container a0576409009276195aa64204901c98faca57d82290117030bc5ad938802e25d5: 
Status 404 returned error can't find the container with id a0576409009276195aa64204901c98faca57d82290117030bc5ad938802e25d5 Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.627924 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:02 crc kubenswrapper[4749]: W0225 07:39:02.629036 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3fee2b8_17be_455a_a4dd_09aac96819d4.slice/crio-9777d88e9b135535f55638561d111d5e21c110f5723a496def047efba58a0e8d WatchSource:0}: Error finding container 9777d88e9b135535f55638561d111d5e21c110f5723a496def047efba58a0e8d: Status 404 returned error can't find the container with id 9777d88e9b135535f55638561d111d5e21c110f5723a496def047efba58a0e8d Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.683337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.702126 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.834338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:02 crc kubenswrapper[4749]: W0225 07:39:02.837254 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b424e62_82ef_4540_b049_c673cce6de0e.slice/crio-44bfff790476ccceeefe0b8164c6ebe5098bffa10e8ca44373dd4a4d6a18472a WatchSource:0}: Error finding container 44bfff790476ccceeefe0b8164c6ebe5098bffa10e8ca44373dd4a4d6a18472a: Status 404 returned error can't find the container with id 44bfff790476ccceeefe0b8164c6ebe5098bffa10e8ca44373dd4a4d6a18472a Feb 25 07:39:02 crc kubenswrapper[4749]: I0225 07:39:02.843930 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"] Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.160838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xxh4"] Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.264007 4749 generic.go:334] "Generic (PLEG): container finished" podID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerID="683ccc7b1eac6c80b6e934d381b28fea247e79059c913f484d4efca516045894" exitCode=0 Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.264091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" event={"ID":"37f54cc1-6ecd-435a-b969-fceaa88d6d4f","Type":"ContainerDied","Data":"683ccc7b1eac6c80b6e934d381b28fea247e79059c913f484d4efca516045894"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.264119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" event={"ID":"37f54cc1-6ecd-435a-b969-fceaa88d6d4f","Type":"ContainerStarted","Data":"cb0239b4fabff63dcf445a9b8d7a8dc670507903714edc301e07e9aa571f5f69"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.273273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerStarted","Data":"9777d88e9b135535f55638561d111d5e21c110f5723a496def047efba58a0e8d"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.276386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6023724-dae0-4f04-8e64-e45d0dd22342","Type":"ContainerStarted","Data":"2dc9bb563d6477bf9785ac351c773da4e8d13edbc922c155795c2ae27815e857"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.278778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6b424e62-82ef-4540-b049-c673cce6de0e","Type":"ContainerStarted","Data":"44bfff790476ccceeefe0b8164c6ebe5098bffa10e8ca44373dd4a4d6a18472a"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.283484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" event={"ID":"4b0b654d-70c1-4ea5-b508-1c365f26720a","Type":"ContainerStarted","Data":"47954b8a3c15ad81aa124eca51afde8d4c621313cd27693eeac4162dbc94d106"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.285200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerStarted","Data":"a0576409009276195aa64204901c98faca57d82290117030bc5ad938802e25d5"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.288059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n7jcl" event={"ID":"58ee757a-870c-4c47-847c-bed7addddb21","Type":"ContainerStarted","Data":"1b127a38cf4d7012536adf90f4ac9dd659ec816d4b71abb427527ef47dcd37d3"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.288125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n7jcl" event={"ID":"58ee757a-870c-4c47-847c-bed7addddb21","Type":"ContainerStarted","Data":"1f7d1290be367755ab48946d50d5e980f631528039eec3e0bfab3b8fb3106682"} Feb 25 07:39:03 crc kubenswrapper[4749]: I0225 07:39:03.310863 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-n7jcl" podStartSLOduration=2.310845985 podStartE2EDuration="2.310845985s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:03.304951767 +0000 UTC m=+1296.666777787" watchObservedRunningTime="2026-02-25 07:39:03.310845985 +0000 UTC m=+1296.672672005" Feb 25 07:39:04 crc 
kubenswrapper[4749]: I0225 07:39:04.315000 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" event={"ID":"4b0b654d-70c1-4ea5-b508-1c365f26720a","Type":"ContainerStarted","Data":"1f08ac38d6d2521c0f4244d5bc80a4fe0909319c97570ac09afd06277a253b74"}
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.320047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" event={"ID":"37f54cc1-6ecd-435a-b969-fceaa88d6d4f","Type":"ContainerStarted","Data":"73251f4d1d186f22486ad55e47907bae7d9f70049407db8c086b5b7e446dcd4f"}
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.320166 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6"
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.340367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" podStartSLOduration=2.340350183 podStartE2EDuration="2.340350183s" podCreationTimestamp="2026-02-25 07:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:04.337310063 +0000 UTC m=+1297.699136093" watchObservedRunningTime="2026-02-25 07:39:04.340350183 +0000 UTC m=+1297.702176193"
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.363901 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" podStartSLOduration=3.363885122 podStartE2EDuration="3.363885122s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:04.358463805 +0000 UTC m=+1297.720289845" watchObservedRunningTime="2026-02-25 07:39:04.363885122 +0000 UTC m=+1297.725711142"
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.910210 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:04 crc kubenswrapper[4749]: I0225 07:39:04.921041 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.344678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6023724-dae0-4f04-8e64-e45d0dd22342","Type":"ContainerStarted","Data":"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.345831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b424e62-82ef-4540-b049-c673cce6de0e","Type":"ContainerStarted","Data":"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.345945 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6b424e62-82ef-4540-b049-c673cce6de0e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d" gracePeriod=30
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.349104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerStarted","Data":"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.349144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerStarted","Data":"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.351750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerStarted","Data":"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.351813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerStarted","Data":"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"}
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.351986 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-log" containerID="cri-o://6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d" gracePeriod=30
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.352449 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-metadata" containerID="cri-o://535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608" gracePeriod=30
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.461261 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.056723597 podStartE2EDuration="6.461237234s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="2026-02-25 07:39:02.633724153 +0000 UTC m=+1295.995550163" lastFinishedPulling="2026-02-25 07:39:06.03823775 +0000 UTC m=+1299.400063800" observedRunningTime="2026-02-25 07:39:07.455338657 +0000 UTC m=+1300.817164677" watchObservedRunningTime="2026-02-25 07:39:07.461237234 +0000 UTC m=+1300.823063254"
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.483650 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.287228646 podStartE2EDuration="6.483633956s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="2026-02-25 07:39:02.8418147 +0000 UTC m=+1296.203640720" lastFinishedPulling="2026-02-25 07:39:06.03821998 +0000 UTC m=+1299.400046030" observedRunningTime="2026-02-25 07:39:07.482745405 +0000 UTC m=+1300.844571415" watchObservedRunningTime="2026-02-25 07:39:07.483633956 +0000 UTC m=+1300.845459976"
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.507911 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.18864521 podStartE2EDuration="6.507890761s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="2026-02-25 07:39:02.715543009 +0000 UTC m=+1296.077369029" lastFinishedPulling="2026-02-25 07:39:06.03478852 +0000 UTC m=+1299.396614580" observedRunningTime="2026-02-25 07:39:07.499565777 +0000 UTC m=+1300.861391807" watchObservedRunningTime="2026-02-25 07:39:07.507890761 +0000 UTC m=+1300.869716781"
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.523868 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.098575002 podStartE2EDuration="6.523823632s" podCreationTimestamp="2026-02-25 07:39:01 +0000 UTC" firstStartedPulling="2026-02-25 07:39:02.610807089 +0000 UTC m=+1295.972633109" lastFinishedPulling="2026-02-25 07:39:06.036055719 +0000 UTC m=+1299.397881739" observedRunningTime="2026-02-25 07:39:07.514467964 +0000 UTC m=+1300.876294004" watchObservedRunningTime="2026-02-25 07:39:07.523823632 +0000 UTC m=+1300.885649642"
Feb 25 07:39:07 crc kubenswrapper[4749]: I0225 07:39:07.935264 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.068261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs\") pod \"f3fee2b8-17be-455a-a4dd-09aac96819d4\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") "
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.068334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data\") pod \"f3fee2b8-17be-455a-a4dd-09aac96819d4\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") "
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.068450 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64s4\" (UniqueName: \"kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4\") pod \"f3fee2b8-17be-455a-a4dd-09aac96819d4\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") "
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.068481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle\") pod \"f3fee2b8-17be-455a-a4dd-09aac96819d4\" (UID: \"f3fee2b8-17be-455a-a4dd-09aac96819d4\") "
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.068887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs" (OuterVolumeSpecName: "logs") pod "f3fee2b8-17be-455a-a4dd-09aac96819d4" (UID: "f3fee2b8-17be-455a-a4dd-09aac96819d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.073289 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4" (OuterVolumeSpecName: "kube-api-access-z64s4") pod "f3fee2b8-17be-455a-a4dd-09aac96819d4" (UID: "f3fee2b8-17be-455a-a4dd-09aac96819d4"). InnerVolumeSpecName "kube-api-access-z64s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.092009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data" (OuterVolumeSpecName: "config-data") pod "f3fee2b8-17be-455a-a4dd-09aac96819d4" (UID: "f3fee2b8-17be-455a-a4dd-09aac96819d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.106758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3fee2b8-17be-455a-a4dd-09aac96819d4" (UID: "f3fee2b8-17be-455a-a4dd-09aac96819d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.170124 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3fee2b8-17be-455a-a4dd-09aac96819d4-logs\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.170154 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.170165 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64s4\" (UniqueName: \"kubernetes.io/projected/f3fee2b8-17be-455a-a4dd-09aac96819d4-kube-api-access-z64s4\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.170175 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3fee2b8-17be-455a-a4dd-09aac96819d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.365984 4749 generic.go:334] "Generic (PLEG): container finished" podID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerID="535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608" exitCode=0
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366028 4749 generic.go:334] "Generic (PLEG): container finished" podID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerID="6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d" exitCode=143
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366083 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerDied","Data":"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"}
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerDied","Data":"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"}
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3fee2b8-17be-455a-a4dd-09aac96819d4","Type":"ContainerDied","Data":"9777d88e9b135535f55638561d111d5e21c110f5723a496def047efba58a0e8d"}
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.366280 4749 scope.go:117] "RemoveContainer" containerID="535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.415964 4749 scope.go:117] "RemoveContainer" containerID="6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.424472 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.439977 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.464353 4749 scope.go:117] "RemoveContainer" containerID="535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.465875 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:08 crc kubenswrapper[4749]: E0225 07:39:08.466722 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608\": container with ID starting with 535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608 not found: ID does not exist" containerID="535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.466769 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"} err="failed to get container status \"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608\": rpc error: code = NotFound desc = could not find container \"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608\": container with ID starting with 535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608 not found: ID does not exist"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.466795 4749 scope.go:117] "RemoveContainer" containerID="6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"
Feb 25 07:39:08 crc kubenswrapper[4749]: E0225 07:39:08.466812 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-log"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.466835 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-log"
Feb 25 07:39:08 crc kubenswrapper[4749]: E0225 07:39:08.466898 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-metadata"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.466922 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-metadata"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.467355 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-metadata"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.467396 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" containerName="nova-metadata-log"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.469056 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: E0225 07:39:08.470128 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d\": container with ID starting with 6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d not found: ID does not exist" containerID="6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.470178 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"} err="failed to get container status \"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d\": rpc error: code = NotFound desc = could not find container \"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d\": container with ID starting with 6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d not found: ID does not exist"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.470203 4749 scope.go:117] "RemoveContainer" containerID="535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.474087 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608"} err="failed to get container status \"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608\": rpc error: code = NotFound desc = could not find container \"535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608\": container with ID starting with 535f435f575fa914139c067fbe01d065df3b795010cdd749e2663370419c0608 not found: ID does not exist"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.474128 4749 scope.go:117] "RemoveContainer" containerID="6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.474178 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.474320 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.474503 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d"} err="failed to get container status \"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d\": rpc error: code = NotFound desc = could not find container \"6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d\": container with ID starting with 6b1502c6d4bc49d281adb82327759769cc1df650ec973e6333ce9f3befbcac2d not found: ID does not exist"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.477264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.576114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.576178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.576199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.576220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.576312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhljl\" (UniqueName: \"kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.677685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.678668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.678700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.678733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.678894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhljl\" (UniqueName: \"kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.679170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.688493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.689505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.692450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.703435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhljl\" (UniqueName: \"kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl\") pod \"nova-metadata-0\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " pod="openstack/nova-metadata-0"
Feb 25 07:39:08 crc kubenswrapper[4749]: I0225 07:39:08.789495 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 25 07:39:09 crc kubenswrapper[4749]: I0225 07:39:09.286993 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:09 crc kubenswrapper[4749]: W0225 07:39:09.297269 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b67464a_e011_4a72_9ffd_6db235913db5.slice/crio-40046c54a8facf53b129404b27a5fd8fe40e3b5952192bfe0acc2c6564b64653 WatchSource:0}: Error finding container 40046c54a8facf53b129404b27a5fd8fe40e3b5952192bfe0acc2c6564b64653: Status 404 returned error can't find the container with id 40046c54a8facf53b129404b27a5fd8fe40e3b5952192bfe0acc2c6564b64653
Feb 25 07:39:09 crc kubenswrapper[4749]: I0225 07:39:09.340684 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3fee2b8-17be-455a-a4dd-09aac96819d4" path="/var/lib/kubelet/pods/f3fee2b8-17be-455a-a4dd-09aac96819d4/volumes"
Feb 25 07:39:09 crc kubenswrapper[4749]: I0225 07:39:09.396208 4749 generic.go:334] "Generic (PLEG): container finished" podID="58ee757a-870c-4c47-847c-bed7addddb21" containerID="1b127a38cf4d7012536adf90f4ac9dd659ec816d4b71abb427527ef47dcd37d3" exitCode=0
Feb 25 07:39:09 crc kubenswrapper[4749]: I0225 07:39:09.396286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n7jcl" event={"ID":"58ee757a-870c-4c47-847c-bed7addddb21","Type":"ContainerDied","Data":"1b127a38cf4d7012536adf90f4ac9dd659ec816d4b71abb427527ef47dcd37d3"}
Feb 25 07:39:09 crc kubenswrapper[4749]: I0225 07:39:09.398623 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerStarted","Data":"40046c54a8facf53b129404b27a5fd8fe40e3b5952192bfe0acc2c6564b64653"}
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.440029 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b0b654d-70c1-4ea5-b508-1c365f26720a" containerID="1f08ac38d6d2521c0f4244d5bc80a4fe0909319c97570ac09afd06277a253b74" exitCode=0
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.440582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" event={"ID":"4b0b654d-70c1-4ea5-b508-1c365f26720a","Type":"ContainerDied","Data":"1f08ac38d6d2521c0f4244d5bc80a4fe0909319c97570ac09afd06277a253b74"}
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.452016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerStarted","Data":"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96"}
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.452077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerStarted","Data":"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f"}
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.493032 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.49301371 podStartE2EDuration="2.49301371s" podCreationTimestamp="2026-02-25 07:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:10.482130658 +0000 UTC m=+1303.843956678" watchObservedRunningTime="2026-02-25 07:39:10.49301371 +0000 UTC m=+1303.854839730"
Feb 25 07:39:10 crc kubenswrapper[4749]: I0225 07:39:10.891975 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n7jcl"
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.029009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data\") pod \"58ee757a-870c-4c47-847c-bed7addddb21\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") "
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.029117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle\") pod \"58ee757a-870c-4c47-847c-bed7addddb21\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") "
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.029213 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts\") pod \"58ee757a-870c-4c47-847c-bed7addddb21\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") "
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.029261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kjmn\" (UniqueName: \"kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn\") pod \"58ee757a-870c-4c47-847c-bed7addddb21\" (UID: \"58ee757a-870c-4c47-847c-bed7addddb21\") "
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.041425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn" (OuterVolumeSpecName: "kube-api-access-5kjmn") pod "58ee757a-870c-4c47-847c-bed7addddb21" (UID: "58ee757a-870c-4c47-847c-bed7addddb21"). InnerVolumeSpecName "kube-api-access-5kjmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.044414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts" (OuterVolumeSpecName: "scripts") pod "58ee757a-870c-4c47-847c-bed7addddb21" (UID: "58ee757a-870c-4c47-847c-bed7addddb21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.082803 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58ee757a-870c-4c47-847c-bed7addddb21" (UID: "58ee757a-870c-4c47-847c-bed7addddb21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.087608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data" (OuterVolumeSpecName: "config-data") pod "58ee757a-870c-4c47-847c-bed7addddb21" (UID: "58ee757a-870c-4c47-847c-bed7addddb21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.131972 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.132311 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kjmn\" (UniqueName: \"kubernetes.io/projected/58ee757a-870c-4c47-847c-bed7addddb21-kube-api-access-5kjmn\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.132333 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.132352 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58ee757a-870c-4c47-847c-bed7addddb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.466011 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.475794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-n7jcl" event={"ID":"58ee757a-870c-4c47-847c-bed7addddb21","Type":"ContainerDied","Data":"1f7d1290be367755ab48946d50d5e980f631528039eec3e0bfab3b8fb3106682"}
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.475856 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f7d1290be367755ab48946d50d5e980f631528039eec3e0bfab3b8fb3106682"
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.476097 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-n7jcl"
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.622348 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.622640 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-log" containerID="cri-o://025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" gracePeriod=30
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.622896 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-api" containerID="cri-o://6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" gracePeriod=30
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.650670 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.650843 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a6023724-dae0-4f04-8e64-e45d0dd22342" containerName="nova-scheduler-scheduler" containerID="cri-o://2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f" gracePeriod=30
Feb 25 07:39:11 crc kubenswrapper[4749]: I0225 07:39:11.669385 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.060112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.066743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6"
Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.103197 4749 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.123374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.124033 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"] Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.124251 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="dnsmasq-dns" containerID="cri-o://b3a693c2d99e58049e6ad8692fd7d56e9bf721d8ffe37eb100f0f8d1de232899" gracePeriod=10 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.156693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle\") pod \"4b0b654d-70c1-4ea5-b508-1c365f26720a\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.156802 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzm4g\" (UniqueName: \"kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g\") pod \"4b0b654d-70c1-4ea5-b508-1c365f26720a\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.156865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts\") pod \"4b0b654d-70c1-4ea5-b508-1c365f26720a\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.157020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data\") pod \"4b0b654d-70c1-4ea5-b508-1c365f26720a\" (UID: \"4b0b654d-70c1-4ea5-b508-1c365f26720a\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.165129 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts" (OuterVolumeSpecName: "scripts") pod "4b0b654d-70c1-4ea5-b508-1c365f26720a" (UID: "4b0b654d-70c1-4ea5-b508-1c365f26720a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.169793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g" (OuterVolumeSpecName: "kube-api-access-vzm4g") pod "4b0b654d-70c1-4ea5-b508-1c365f26720a" (UID: "4b0b654d-70c1-4ea5-b508-1c365f26720a"). InnerVolumeSpecName "kube-api-access-vzm4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.192091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data" (OuterVolumeSpecName: "config-data") pod "4b0b654d-70c1-4ea5-b508-1c365f26720a" (UID: "4b0b654d-70c1-4ea5-b508-1c365f26720a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.204498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b0b654d-70c1-4ea5-b508-1c365f26720a" (UID: "4b0b654d-70c1-4ea5-b508-1c365f26720a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.260404 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzm4g\" (UniqueName: \"kubernetes.io/projected/4b0b654d-70c1-4ea5-b508-1c365f26720a-kube-api-access-vzm4g\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.260432 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.260441 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.260450 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0b654d-70c1-4ea5-b508-1c365f26720a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.496757 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.497877 4749 generic.go:334] "Generic (PLEG): container finished" podID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerID="b3a693c2d99e58049e6ad8692fd7d56e9bf721d8ffe37eb100f0f8d1de232899" exitCode=0 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.497979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" event={"ID":"32cd89a4-78d4-4298-90b6-d854e9a35178","Type":"ContainerDied","Data":"b3a693c2d99e58049e6ad8692fd7d56e9bf721d8ffe37eb100f0f8d1de232899"} Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.501641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" event={"ID":"4b0b654d-70c1-4ea5-b508-1c365f26720a","Type":"ContainerDied","Data":"47954b8a3c15ad81aa124eca51afde8d4c621313cd27693eeac4162dbc94d106"} Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.501670 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47954b8a3c15ad81aa124eca51afde8d4c621313cd27693eeac4162dbc94d106" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.501714 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xxh4" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512636 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerID="6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" exitCode=0 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512660 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerID="025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" exitCode=143 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512786 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-log" containerID="cri-o://4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" gracePeriod=30 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerDied","Data":"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2"} Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512860 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerDied","Data":"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c"} Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263","Type":"ContainerDied","Data":"a0576409009276195aa64204901c98faca57d82290117030bc5ad938802e25d5"} Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.512933 4749 scope.go:117] "RemoveContainer" containerID="6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.513116 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-metadata" containerID="cri-o://41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" gracePeriod=30 Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.552588 4749 scope.go:117] "RemoveContainer" containerID="025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.565580 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data\") pod \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.565864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs\") pod \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\" (UID: 
\"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.566186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs" (OuterVolumeSpecName: "logs") pod "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" (UID: "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.566436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-przbw\" (UniqueName: \"kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw\") pod \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.567087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle\") pod \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\" (UID: \"0e64b9e0-c5c2-4b11-a6dc-72fffdc15263\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.567782 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.568626 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.569033 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0b654d-70c1-4ea5-b508-1c365f26720a" containerName="nova-cell1-conductor-db-sync" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.569051 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0b654d-70c1-4ea5-b508-1c365f26720a" 
containerName="nova-cell1-conductor-db-sync" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.569069 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ee757a-870c-4c47-847c-bed7addddb21" containerName="nova-manage" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.569078 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ee757a-870c-4c47-847c-bed7addddb21" containerName="nova-manage" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.569088 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-api" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.569096 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-api" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.569109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-log" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.569118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-log" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.569916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw" (OuterVolumeSpecName: "kube-api-access-przbw") pod "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" (UID: "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263"). InnerVolumeSpecName "kube-api-access-przbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.595138 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-api" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.595180 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" containerName="nova-api-log" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.595202 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ee757a-870c-4c47-847c-bed7addddb21" containerName="nova-manage" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.595216 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0b654d-70c1-4ea5-b508-1c365f26720a" containerName="nova-cell1-conductor-db-sync" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.595944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.598015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.600328 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.602858 4749 scope.go:117] "RemoveContainer" containerID="6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.606726 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2\": container with ID starting with 6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2 not found: ID does not exist" 
containerID="6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.606756 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2"} err="failed to get container status \"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2\": rpc error: code = NotFound desc = could not find container \"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2\": container with ID starting with 6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2 not found: ID does not exist" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.606773 4749 scope.go:117] "RemoveContainer" containerID="025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.606891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" (UID: "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.607143 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c\": container with ID starting with 025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c not found: ID does not exist" containerID="025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.607162 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c"} err="failed to get container status \"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c\": rpc error: code = NotFound desc = could not find container \"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c\": container with ID starting with 025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c not found: ID does not exist" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.607174 4749 scope.go:117] "RemoveContainer" containerID="6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.607566 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2"} err="failed to get container status \"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2\": rpc error: code = NotFound desc = could not find container \"6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2\": container with ID starting with 6c0c8f3508087c9a0ad01284952f815fa205c7ae760464f1a6b36483e59b1ac2 not found: ID does not exist" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.607613 4749 scope.go:117] "RemoveContainer" 
containerID="025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.608156 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c"} err="failed to get container status \"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c\": rpc error: code = NotFound desc = could not find container \"025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c\": container with ID starting with 025ea7d9c67fe9760edbec031693165af1a733a9c024d6d34ce4c70016b56d1c not found: ID does not exist" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.611244 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data" (OuterVolumeSpecName: "config-data") pod "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" (UID: "0e64b9e0-c5c2-4b11-a6dc-72fffdc15263"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669120 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lwlf\" (UniqueName: \"kubernetes.io/projected/e800eb5f-0979-4ad3-8f0a-1adab77e1259-kube-api-access-7lwlf\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669439 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669457 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669491 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.669505 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-przbw\" (UniqueName: \"kubernetes.io/projected/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263-kube-api-access-przbw\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771773 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb5vv\" (UniqueName: \"kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.771962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb\") pod \"32cd89a4-78d4-4298-90b6-d854e9a35178\" (UID: \"32cd89a4-78d4-4298-90b6-d854e9a35178\") " Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.772385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.772525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lwlf\" (UniqueName: \"kubernetes.io/projected/e800eb5f-0979-4ad3-8f0a-1adab77e1259-kube-api-access-7lwlf\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.772552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.779484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.780327 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv" (OuterVolumeSpecName: "kube-api-access-tb5vv") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "kube-api-access-tb5vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.789950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e800eb5f-0979-4ad3-8f0a-1adab77e1259-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.790219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lwlf\" (UniqueName: \"kubernetes.io/projected/e800eb5f-0979-4ad3-8f0a-1adab77e1259-kube-api-access-7lwlf\") pod \"nova-cell1-conductor-0\" (UID: \"e800eb5f-0979-4ad3-8f0a-1adab77e1259\") " pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.823902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.824106 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config" (OuterVolumeSpecName: "config") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.842307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.844908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.863920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32cd89a4-78d4-4298-90b6-d854e9a35178" (UID: "32cd89a4-78d4-4298-90b6-d854e9a35178"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.873220 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881272 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881310 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881322 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb5vv\" (UniqueName: \"kubernetes.io/projected/32cd89a4-78d4-4298-90b6-d854e9a35178-kube-api-access-tb5vv\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881341 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881356 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.881366 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32cd89a4-78d4-4298-90b6-d854e9a35178-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.900527 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 
07:39:12.919339 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.920192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="init" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.920224 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="init" Feb 25 07:39:12 crc kubenswrapper[4749]: E0225 07:39:12.920249 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="dnsmasq-dns" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.920258 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="dnsmasq-dns" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.920501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" containerName="dnsmasq-dns" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.922056 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.930656 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.932246 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.967226 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.987703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.987875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7qf\" (UniqueName: \"kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.987904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:12 crc kubenswrapper[4749]: I0225 07:39:12.987994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.078090 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.090543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.090663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.090946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7qf\" (UniqueName: \"kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.090981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.092098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.103308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.114218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.122579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7qf\" (UniqueName: \"kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf\") pod \"nova-api-0\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.192384 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhljl\" (UniqueName: \"kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl\") pod \"3b67464a-e011-4a72-9ffd-6db235913db5\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.192521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data\") pod \"3b67464a-e011-4a72-9ffd-6db235913db5\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.193070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs\") pod \"3b67464a-e011-4a72-9ffd-6db235913db5\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.193140 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs\") pod \"3b67464a-e011-4a72-9ffd-6db235913db5\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.193169 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle\") pod \"3b67464a-e011-4a72-9ffd-6db235913db5\" (UID: \"3b67464a-e011-4a72-9ffd-6db235913db5\") " Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.197439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs" (OuterVolumeSpecName: "logs") pod "3b67464a-e011-4a72-9ffd-6db235913db5" (UID: "3b67464a-e011-4a72-9ffd-6db235913db5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.202820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl" (OuterVolumeSpecName: "kube-api-access-lhljl") pod "3b67464a-e011-4a72-9ffd-6db235913db5" (UID: "3b67464a-e011-4a72-9ffd-6db235913db5"). InnerVolumeSpecName "kube-api-access-lhljl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.236783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b67464a-e011-4a72-9ffd-6db235913db5" (UID: "3b67464a-e011-4a72-9ffd-6db235913db5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.246338 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data" (OuterVolumeSpecName: "config-data") pod "3b67464a-e011-4a72-9ffd-6db235913db5" (UID: "3b67464a-e011-4a72-9ffd-6db235913db5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.260684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.277646 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3b67464a-e011-4a72-9ffd-6db235913db5" (UID: "3b67464a-e011-4a72-9ffd-6db235913db5"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.295540 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b67464a-e011-4a72-9ffd-6db235913db5-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.295629 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.295649 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.295663 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhljl\" (UniqueName: \"kubernetes.io/projected/3b67464a-e011-4a72-9ffd-6db235913db5-kube-api-access-lhljl\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.295673 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b67464a-e011-4a72-9ffd-6db235913db5-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.336242 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e64b9e0-c5c2-4b11-a6dc-72fffdc15263" path="/var/lib/kubelet/pods/0e64b9e0-c5c2-4b11-a6dc-72fffdc15263/volumes" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.474922 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: W0225 07:39:13.485107 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode800eb5f_0979_4ad3_8f0a_1adab77e1259.slice/crio-770efbe72eeceedec688922d55525f33da20acc02de2a3737f3e1646b2764479 WatchSource:0}: Error finding container 770efbe72eeceedec688922d55525f33da20acc02de2a3737f3e1646b2764479: Status 404 returned error can't find the container with id 770efbe72eeceedec688922d55525f33da20acc02de2a3737f3e1646b2764479 Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.526746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" event={"ID":"32cd89a4-78d4-4298-90b6-d854e9a35178","Type":"ContainerDied","Data":"3e69e8b6780422d97d026bb5587bcfb8008557952477dd4bf63e1ed3384e3338"} Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.526792 4749 scope.go:117] "RemoveContainer" containerID="b3a693c2d99e58049e6ad8692fd7d56e9bf721d8ffe37eb100f0f8d1de232899" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.526927 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-9wvml" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535471 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b67464a-e011-4a72-9ffd-6db235913db5" containerID="41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" exitCode=0 Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535495 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b67464a-e011-4a72-9ffd-6db235913db5" containerID="4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" exitCode=143 Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerDied","Data":"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96"} Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerDied","Data":"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f"} Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3b67464a-e011-4a72-9ffd-6db235913db5","Type":"ContainerDied","Data":"40046c54a8facf53b129404b27a5fd8fe40e3b5952192bfe0acc2c6564b64653"} Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.535701 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.540143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e800eb5f-0979-4ad3-8f0a-1adab77e1259","Type":"ContainerStarted","Data":"770efbe72eeceedec688922d55525f33da20acc02de2a3737f3e1646b2764479"} Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.576715 4749 scope.go:117] "RemoveContainer" containerID="bedceda6e587ee5b9a0de07ac8ba3359846cbeefd382ad880bf66c7b360276ce" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.613282 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.616702 4749 scope.go:117] "RemoveContainer" containerID="41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.632103 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.661261 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"] Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.673542 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-9wvml"] Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.679187 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: E0225 07:39:13.679572 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-metadata" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.679586 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-metadata" Feb 25 07:39:13 crc kubenswrapper[4749]: E0225 07:39:13.679616 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-log" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.679622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-log" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.679796 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-log" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.679808 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" containerName="nova-metadata-metadata" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.680815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.684276 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.684547 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.688509 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.694092 4749 scope.go:117] "RemoveContainer" containerID="4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.732197 4749 scope.go:117] "RemoveContainer" containerID="41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.732756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:13 crc kubenswrapper[4749]: E0225 07:39:13.733173 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96\": container with ID starting with 41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96 not found: ID does not exist" containerID="41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733211 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96"} err="failed to get container status \"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96\": rpc error: code = NotFound desc = could not find container \"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96\": container with ID starting with 41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96 not found: ID does not exist" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733234 4749 scope.go:117] "RemoveContainer" containerID="4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" Feb 25 07:39:13 crc kubenswrapper[4749]: E0225 07:39:13.733490 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f\": container with ID starting with 4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f not found: ID does not exist" containerID="4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733509 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f"} err="failed to get container status \"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f\": rpc error: code = NotFound desc = could 
not find container \"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f\": container with ID starting with 4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f not found: ID does not exist" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733522 4749 scope.go:117] "RemoveContainer" containerID="41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733862 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96"} err="failed to get container status \"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96\": rpc error: code = NotFound desc = could not find container \"41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96\": container with ID starting with 41804ead0a83303680149a4d5dcb4a74b0df769284a1656fd24d2adaf6f49f96 not found: ID does not exist" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.733882 4749 scope.go:117] "RemoveContainer" containerID="4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.734142 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f"} err="failed to get container status \"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f\": rpc error: code = NotFound desc = could not find container \"4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f\": container with ID starting with 4efe4a51ed0031ba9a98218a3594db6900de3e05c6cc2091079271899581300f not found: ID does not exist" Feb 25 07:39:13 crc kubenswrapper[4749]: W0225 07:39:13.736864 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc44b42a6_ec9f_4317_b25d_d004f9b2fa2e.slice/crio-b6f4e1aca283b424660ad1ebc6763fde21992b749fc859cf12cddea9d0377ce5 WatchSource:0}: Error finding container b6f4e1aca283b424660ad1ebc6763fde21992b749fc859cf12cddea9d0377ce5: Status 404 returned error can't find the container with id b6f4e1aca283b424660ad1ebc6763fde21992b749fc859cf12cddea9d0377ce5 Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.820669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.820745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.820802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.820870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjl6m\" (UniqueName: \"kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: 
I0225 07:39:13.820893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.925999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjl6m\" (UniqueName: \"kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.926100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.926225 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.926298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.926377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.926768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.934894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.935385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.936976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc kubenswrapper[4749]: I0225 07:39:13.954068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjl6m\" (UniqueName: \"kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m\") pod \"nova-metadata-0\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " pod="openstack/nova-metadata-0" Feb 25 07:39:13 crc 
kubenswrapper[4749]: I0225 07:39:13.994762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.077214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.241452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle\") pod \"a6023724-dae0-4f04-8e64-e45d0dd22342\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.241517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r59f\" (UniqueName: \"kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f\") pod \"a6023724-dae0-4f04-8e64-e45d0dd22342\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.241765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data\") pod \"a6023724-dae0-4f04-8e64-e45d0dd22342\" (UID: \"a6023724-dae0-4f04-8e64-e45d0dd22342\") " Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.250191 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f" (OuterVolumeSpecName: "kube-api-access-7r59f") pod "a6023724-dae0-4f04-8e64-e45d0dd22342" (UID: "a6023724-dae0-4f04-8e64-e45d0dd22342"). InnerVolumeSpecName "kube-api-access-7r59f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.280718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6023724-dae0-4f04-8e64-e45d0dd22342" (UID: "a6023724-dae0-4f04-8e64-e45d0dd22342"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.293423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data" (OuterVolumeSpecName: "config-data") pod "a6023724-dae0-4f04-8e64-e45d0dd22342" (UID: "a6023724-dae0-4f04-8e64-e45d0dd22342"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.344380 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.344411 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r59f\" (UniqueName: \"kubernetes.io/projected/a6023724-dae0-4f04-8e64-e45d0dd22342-kube-api-access-7r59f\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.344423 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6023724-dae0-4f04-8e64-e45d0dd22342-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.509072 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.552546 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerStarted","Data":"fbde5e2d33077ee2ec05cdab5c8691e53052f9ed34d034895deb7261a6707df6"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.555405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerStarted","Data":"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.555431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerStarted","Data":"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.555441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerStarted","Data":"b6f4e1aca283b424660ad1ebc6763fde21992b749fc859cf12cddea9d0377ce5"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.557320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e800eb5f-0979-4ad3-8f0a-1adab77e1259","Type":"ContainerStarted","Data":"cf770e819af43b9e8263a0d6d019c69450373dae92424a10c71ab08c3a8f482c"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.557715 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.559027 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6023724-dae0-4f04-8e64-e45d0dd22342" containerID="2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f" exitCode=0 Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.559055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"a6023724-dae0-4f04-8e64-e45d0dd22342","Type":"ContainerDied","Data":"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.559071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6023724-dae0-4f04-8e64-e45d0dd22342","Type":"ContainerDied","Data":"2dc9bb563d6477bf9785ac351c773da4e8d13edbc922c155795c2ae27815e857"} Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.559087 4749 scope.go:117] "RemoveContainer" containerID="2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.559154 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.578761 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.578546587 podStartE2EDuration="2.578546587s" podCreationTimestamp="2026-02-25 07:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:14.569457288 +0000 UTC m=+1307.931283308" watchObservedRunningTime="2026-02-25 07:39:14.578546587 +0000 UTC m=+1307.940372607" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.595741 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.595717962 podStartE2EDuration="2.595717962s" podCreationTimestamp="2026-02-25 07:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:14.585675899 +0000 UTC m=+1307.947501919" watchObservedRunningTime="2026-02-25 07:39:14.595717962 +0000 UTC m=+1307.957543992" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.608221 4749 
scope.go:117] "RemoveContainer" containerID="2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f" Feb 25 07:39:14 crc kubenswrapper[4749]: E0225 07:39:14.608864 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f\": container with ID starting with 2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f not found: ID does not exist" containerID="2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.608900 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f"} err="failed to get container status \"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f\": rpc error: code = NotFound desc = could not find container \"2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f\": container with ID starting with 2f89ee01416863c2ca9766f1a537a10e584821b222795ecfd87121357860738f not found: ID does not exist" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.619233 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.633851 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.676477 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:14 crc kubenswrapper[4749]: E0225 07:39:14.678617 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6023724-dae0-4f04-8e64-e45d0dd22342" containerName="nova-scheduler-scheduler" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.678815 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6023724-dae0-4f04-8e64-e45d0dd22342" 
containerName="nova-scheduler-scheduler" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.680072 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6023724-dae0-4f04-8e64-e45d0dd22342" containerName="nova-scheduler-scheduler" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.681650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.685250 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.700928 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.753891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.754218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44f6k\" (UniqueName: \"kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.754259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.855473 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.855539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44f6k\" (UniqueName: \"kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.855578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.858725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.858748 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:14 crc kubenswrapper[4749]: I0225 07:39:14.871316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44f6k\" (UniqueName: 
\"kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k\") pod \"nova-scheduler-0\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.030343 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.332035 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32cd89a4-78d4-4298-90b6-d854e9a35178" path="/var/lib/kubelet/pods/32cd89a4-78d4-4298-90b6-d854e9a35178/volumes" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.332850 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b67464a-e011-4a72-9ffd-6db235913db5" path="/var/lib/kubelet/pods/3b67464a-e011-4a72-9ffd-6db235913db5/volumes" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.333362 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6023724-dae0-4f04-8e64-e45d0dd22342" path="/var/lib/kubelet/pods/a6023724-dae0-4f04-8e64-e45d0dd22342/volumes" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.554993 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.580029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerStarted","Data":"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de"} Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.580077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerStarted","Data":"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db"} Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.585130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"426a8ace-3a5f-4695-8f7f-0710b1418b00","Type":"ContainerStarted","Data":"60a7a63df348b0a68297f147be6c3a63a62fb3b8f70a85c7181e474fb64bf935"} Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.615666 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.615641228 podStartE2EDuration="2.615641228s" podCreationTimestamp="2026-02-25 07:39:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:15.60369469 +0000 UTC m=+1308.965520710" watchObservedRunningTime="2026-02-25 07:39:15.615641228 +0000 UTC m=+1308.977467248" Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.853873 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:15 crc kubenswrapper[4749]: I0225 07:39:15.854274 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" containerName="kube-state-metrics" containerID="cri-o://5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391" gracePeriod=30 Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.385209 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.482559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxr6\" (UniqueName: \"kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6\") pod \"54b96c95-7c89-40c7-a51c-6a5c4c59a036\" (UID: \"54b96c95-7c89-40c7-a51c-6a5c4c59a036\") " Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.491806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6" (OuterVolumeSpecName: "kube-api-access-8dxr6") pod "54b96c95-7c89-40c7-a51c-6a5c4c59a036" (UID: "54b96c95-7c89-40c7-a51c-6a5c4c59a036"). InnerVolumeSpecName "kube-api-access-8dxr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.584991 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dxr6\" (UniqueName: \"kubernetes.io/projected/54b96c95-7c89-40c7-a51c-6a5c4c59a036-kube-api-access-8dxr6\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.593648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"426a8ace-3a5f-4695-8f7f-0710b1418b00","Type":"ContainerStarted","Data":"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4"} Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.598272 4749 generic.go:334] "Generic (PLEG): container finished" podID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" containerID="5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391" exitCode=2 Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.600124 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.600647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54b96c95-7c89-40c7-a51c-6a5c4c59a036","Type":"ContainerDied","Data":"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391"} Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.600705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"54b96c95-7c89-40c7-a51c-6a5c4c59a036","Type":"ContainerDied","Data":"53594703c8f052292af4dc3a4af1d5b822faa7d949e2e16580a536ca5a8aa8f7"} Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.600730 4749 scope.go:117] "RemoveContainer" containerID="5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.621434 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.621409482 podStartE2EDuration="2.621409482s" podCreationTimestamp="2026-02-25 07:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:16.617938749 +0000 UTC m=+1309.979764779" watchObservedRunningTime="2026-02-25 07:39:16.621409482 +0000 UTC m=+1309.983235522" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.641223 4749 scope.go:117] "RemoveContainer" containerID="5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391" Feb 25 07:39:16 crc kubenswrapper[4749]: E0225 07:39:16.644105 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391\": container with ID starting with 5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391 not found: ID does not exist" 
containerID="5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.644153 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391"} err="failed to get container status \"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391\": rpc error: code = NotFound desc = could not find container \"5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391\": container with ID starting with 5eaed9f6623b9b0ecf13cdc8713008c507c489af8f0fada0185d242654f4b391 not found: ID does not exist" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.656441 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.664418 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.672679 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:16 crc kubenswrapper[4749]: E0225 07:39:16.673111 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" containerName="kube-state-metrics" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.673122 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" containerName="kube-state-metrics" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.673282 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" containerName="kube-state-metrics" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.673974 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.678168 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.678288 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.684442 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.801806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.801863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.801943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkwf\" (UniqueName: \"kubernetes.io/projected/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-api-access-fbkwf\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.802074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.903156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.903238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.903254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.903291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkwf\" (UniqueName: \"kubernetes.io/projected/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-api-access-fbkwf\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.908031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.908315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.908741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:16 crc kubenswrapper[4749]: I0225 07:39:16.937969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbkwf\" (UniqueName: \"kubernetes.io/projected/11165cb9-7a9a-425b-8eea-42e61a784a57-kube-api-access-fbkwf\") pod \"kube-state-metrics-0\" (UID: \"11165cb9-7a9a-425b-8eea-42e61a784a57\") " pod="openstack/kube-state-metrics-0" Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.007642 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.338063 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b96c95-7c89-40c7-a51c-6a5c4c59a036" path="/var/lib/kubelet/pods/54b96c95-7c89-40c7-a51c-6a5c4c59a036/volumes" Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.452004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.612279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11165cb9-7a9a-425b-8eea-42e61a784a57","Type":"ContainerStarted","Data":"be5f564f62506735d04eb482459cfdc7537944eaaed9fbaa38d2da6ec5bdb922"} Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.777906 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.778168 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-central-agent" containerID="cri-o://576a35bf4d501a7bd5a5abcead4e116551289c5ebc27021fb0bea3ee9d50dd57" gracePeriod=30 Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.778230 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="proxy-httpd" containerID="cri-o://ad78b9fabf8821ac2dcf29ade5240ce787b153daf923955933c705ddbaec3df0" gracePeriod=30 Feb 25 07:39:17 crc kubenswrapper[4749]: I0225 07:39:17.778291 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-notification-agent" containerID="cri-o://b8221002f7ca405abe79eb80840e7b865672e8a54af9d0f084955e497e58bccc" gracePeriod=30 Feb 25 07:39:17 
crc kubenswrapper[4749]: I0225 07:39:17.778252 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="sg-core" containerID="cri-o://004b9bad1ef9bdb43eda8a4e624740d0c1a80e06e69bf9c7a90fd7c8d9a87b9d" gracePeriod=30 Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.629423 4749 generic.go:334] "Generic (PLEG): container finished" podID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerID="ad78b9fabf8821ac2dcf29ade5240ce787b153daf923955933c705ddbaec3df0" exitCode=0 Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.630055 4749 generic.go:334] "Generic (PLEG): container finished" podID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerID="004b9bad1ef9bdb43eda8a4e624740d0c1a80e06e69bf9c7a90fd7c8d9a87b9d" exitCode=2 Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.630073 4749 generic.go:334] "Generic (PLEG): container finished" podID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerID="576a35bf4d501a7bd5a5abcead4e116551289c5ebc27021fb0bea3ee9d50dd57" exitCode=0 Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.629486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerDied","Data":"ad78b9fabf8821ac2dcf29ade5240ce787b153daf923955933c705ddbaec3df0"} Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.630151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerDied","Data":"004b9bad1ef9bdb43eda8a4e624740d0c1a80e06e69bf9c7a90fd7c8d9a87b9d"} Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.630189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerDied","Data":"576a35bf4d501a7bd5a5abcead4e116551289c5ebc27021fb0bea3ee9d50dd57"} Feb 25 07:39:18 crc 
kubenswrapper[4749]: I0225 07:39:18.631979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"11165cb9-7a9a-425b-8eea-42e61a784a57","Type":"ContainerStarted","Data":"03c25680f7febd882db8137b9f151b27cf4290c6fedc66000c3bd72891f88aa1"} Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.632160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.653037 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.282792412 podStartE2EDuration="2.653011516s" podCreationTimestamp="2026-02-25 07:39:16 +0000 UTC" firstStartedPulling="2026-02-25 07:39:17.458746878 +0000 UTC m=+1310.820572898" lastFinishedPulling="2026-02-25 07:39:17.828965982 +0000 UTC m=+1311.190792002" observedRunningTime="2026-02-25 07:39:18.647110484 +0000 UTC m=+1312.008936514" watchObservedRunningTime="2026-02-25 07:39:18.653011516 +0000 UTC m=+1312.014837536" Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.995803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 07:39:18 crc kubenswrapper[4749]: I0225 07:39:18.995869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.653060 4749 generic.go:334] "Generic (PLEG): container finished" podID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerID="b8221002f7ca405abe79eb80840e7b865672e8a54af9d0f084955e497e58bccc" exitCode=0 Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.653736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerDied","Data":"b8221002f7ca405abe79eb80840e7b865672e8a54af9d0f084955e497e58bccc"} Feb 25 07:39:19 crc kubenswrapper[4749]: 
I0225 07:39:19.889745 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln6dw\" (UniqueName: \"kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958802 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.958851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data\") pod \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\" (UID: \"f55fe179-96dd-4ed7-8a62-5e3ca048200f\") " Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.960668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.964324 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.965070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw" (OuterVolumeSpecName: "kube-api-access-ln6dw") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "kube-api-access-ln6dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:19 crc kubenswrapper[4749]: I0225 07:39:19.979923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts" (OuterVolumeSpecName: "scripts") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.009396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.031644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.060753 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.060783 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.060793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln6dw\" (UniqueName: \"kubernetes.io/projected/f55fe179-96dd-4ed7-8a62-5e3ca048200f-kube-api-access-ln6dw\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.060803 4749 reconciler_common.go:293] "Volume detached for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.060811 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f55fe179-96dd-4ed7-8a62-5e3ca048200f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.067292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data" (OuterVolumeSpecName: "config-data") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.080751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f55fe179-96dd-4ed7-8a62-5e3ca048200f" (UID: "f55fe179-96dd-4ed7-8a62-5e3ca048200f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.163105 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.163137 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f55fe179-96dd-4ed7-8a62-5e3ca048200f-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.667817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f55fe179-96dd-4ed7-8a62-5e3ca048200f","Type":"ContainerDied","Data":"2f4e6b885255df45ade2772b416af5a398a52ee497540d58713501231605a449"} Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.667890 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.668180 4749 scope.go:117] "RemoveContainer" containerID="ad78b9fabf8821ac2dcf29ade5240ce787b153daf923955933c705ddbaec3df0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.693733 4749 scope.go:117] "RemoveContainer" containerID="004b9bad1ef9bdb43eda8a4e624740d0c1a80e06e69bf9c7a90fd7c8d9a87b9d" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.716123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.727804 4749 scope.go:117] "RemoveContainer" containerID="b8221002f7ca405abe79eb80840e7b865672e8a54af9d0f084955e497e58bccc" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.737171 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.750515 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:20 crc kubenswrapper[4749]: E0225 07:39:20.751256 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="sg-core" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.751363 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="sg-core" Feb 25 07:39:20 crc kubenswrapper[4749]: E0225 07:39:20.751461 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="proxy-httpd" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.751536 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="proxy-httpd" Feb 25 07:39:20 crc kubenswrapper[4749]: E0225 07:39:20.751665 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-central-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.751744 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-central-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: E0225 07:39:20.751831 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-notification-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.751908 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-notification-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.752194 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="sg-core" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.752274 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="proxy-httpd" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.752338 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-central-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.752423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" containerName="ceilometer-notification-agent" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.754525 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.757641 4749 scope.go:117] "RemoveContainer" containerID="576a35bf4d501a7bd5a5abcead4e116551289c5ebc27021fb0bea3ee9d50dd57" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.759888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.760015 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.760127 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.765127 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.877838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.877905 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.877966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.878002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.878028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.878073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.878099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzbl\" (UniqueName: 
\"kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.878121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzbl\" (UniqueName: \"kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" 
Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.980854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.981039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.981831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd\") pod 
\"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.985123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.985770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.986325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.997370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:20 crc kubenswrapper[4749]: I0225 07:39:20.998341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzbl\" (UniqueName: \"kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:21 crc kubenswrapper[4749]: I0225 07:39:21.001670 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " pod="openstack/ceilometer-0" Feb 25 07:39:21 crc kubenswrapper[4749]: I0225 07:39:21.094764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:21 crc kubenswrapper[4749]: I0225 07:39:21.334630 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55fe179-96dd-4ed7-8a62-5e3ca048200f" path="/var/lib/kubelet/pods/f55fe179-96dd-4ed7-8a62-5e3ca048200f/volumes" Feb 25 07:39:22 crc kubenswrapper[4749]: I0225 07:39:22.345701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:22 crc kubenswrapper[4749]: W0225 07:39:22.359072 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd427e4d_d068_482b_bbd1_d71916608054.slice/crio-fcf5bf3bd534a214fa7819b43587b62b7e5c3a62107deadb05cdcdb573760b46 WatchSource:0}: Error finding container fcf5bf3bd534a214fa7819b43587b62b7e5c3a62107deadb05cdcdb573760b46: Status 404 returned error can't find the container with id fcf5bf3bd534a214fa7819b43587b62b7e5c3a62107deadb05cdcdb573760b46 Feb 25 07:39:22 crc kubenswrapper[4749]: I0225 07:39:22.708310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerStarted","Data":"fcf5bf3bd534a214fa7819b43587b62b7e5c3a62107deadb05cdcdb573760b46"} Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.017994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.261606 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 
07:39:23.261690 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.739772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerStarted","Data":"8d270c819cb2df8bbda86ed33bcba2e36abe9c4f47f29052de47d04b43bd218c"} Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.740139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerStarted","Data":"77b36762d92e201fe16de9a8b70191ed1b561797551f3eab10d13ff4945450de"} Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.995760 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 07:39:23 crc kubenswrapper[4749]: I0225 07:39:23.995823 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 25 07:39:24 crc kubenswrapper[4749]: I0225 07:39:24.343796 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:39:24 crc kubenswrapper[4749]: I0225 07:39:24.343796 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:39:24 crc kubenswrapper[4749]: I0225 07:39:24.771185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerStarted","Data":"64d75d755e7515e0722d3c224c29aeb55c695c3d4e5c6a67676544802d444e35"} Feb 25 07:39:25 crc kubenswrapper[4749]: I0225 07:39:25.012880 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 25 07:39:25 crc kubenswrapper[4749]: I0225 07:39:25.012910 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:39:25 crc kubenswrapper[4749]: I0225 07:39:25.031400 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 25 07:39:25 crc kubenswrapper[4749]: I0225 07:39:25.060219 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 25 07:39:25 crc kubenswrapper[4749]: I0225 07:39:25.833218 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 25 07:39:26 crc kubenswrapper[4749]: I0225 07:39:26.794828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerStarted","Data":"5fb093357ddca1dd56146dfb176f613a9599438d3c987ab578dfe09fee5fe988"} Feb 25 07:39:26 crc kubenswrapper[4749]: I0225 07:39:26.795140 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 25 07:39:26 crc kubenswrapper[4749]: I0225 07:39:26.831765 4749 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=3.427432393 podStartE2EDuration="6.831741135s" podCreationTimestamp="2026-02-25 07:39:20 +0000 UTC" firstStartedPulling="2026-02-25 07:39:22.361724201 +0000 UTC m=+1315.723550231" lastFinishedPulling="2026-02-25 07:39:25.766032923 +0000 UTC m=+1319.127858973" observedRunningTime="2026-02-25 07:39:26.819578541 +0000 UTC m=+1320.181404561" watchObservedRunningTime="2026-02-25 07:39:26.831741135 +0000 UTC m=+1320.193567165" Feb 25 07:39:27 crc kubenswrapper[4749]: I0225 07:39:27.020687 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.265818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.266987 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.268459 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.269869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.854084 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 25 07:39:33 crc kubenswrapper[4749]: I0225 07:39:33.857686 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.033480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.033763 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:39:34 crc 
kubenswrapper[4749]: I0225 07:39:34.037029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.051387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.054709 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.058700 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.223983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.224038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.224257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.224522 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.224570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.224690 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nhl\" (UniqueName: \"kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326696 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.326803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nhl\" (UniqueName: \"kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.328235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.328510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.328586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.328663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.329304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.345092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nhl\" (UniqueName: \"kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl\") pod \"dnsmasq-dns-cd5cbd7b9-msfrz\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.364918 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.873275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 25 07:39:34 crc kubenswrapper[4749]: I0225 07:39:34.894000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:39:35 crc kubenswrapper[4749]: I0225 07:39:35.872823 4749 generic.go:334] "Generic (PLEG): container finished" podID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerID="cdf7247bc56f70baa999ff5bef0403ad8f1ae80ffe68f294f531558cd8b11eca" exitCode=0 Feb 25 07:39:35 crc kubenswrapper[4749]: I0225 07:39:35.872920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" event={"ID":"52a97588-ac3f-4ee4-8f8b-43f305bf0392","Type":"ContainerDied","Data":"cdf7247bc56f70baa999ff5bef0403ad8f1ae80ffe68f294f531558cd8b11eca"} Feb 25 07:39:35 crc kubenswrapper[4749]: I0225 07:39:35.873175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" event={"ID":"52a97588-ac3f-4ee4-8f8b-43f305bf0392","Type":"ContainerStarted","Data":"3f4c78d88205c0a635cb935f272d8aac547b7c3cdda561ecd820c3c5de690713"} Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.098011 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.098279 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-central-agent" containerID="cri-o://77b36762d92e201fe16de9a8b70191ed1b561797551f3eab10d13ff4945450de" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.098397 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd427e4d-d068-482b-bbd1-d71916608054" 
containerName="proxy-httpd" containerID="cri-o://5fb093357ddca1dd56146dfb176f613a9599438d3c987ab578dfe09fee5fe988" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.098442 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="sg-core" containerID="cri-o://64d75d755e7515e0722d3c224c29aeb55c695c3d4e5c6a67676544802d444e35" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.098471 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-notification-agent" containerID="cri-o://8d270c819cb2df8bbda86ed33bcba2e36abe9c4f47f29052de47d04b43bd218c" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.117208 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.495548 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884259 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd427e4d-d068-482b-bbd1-d71916608054" containerID="5fb093357ddca1dd56146dfb176f613a9599438d3c987ab578dfe09fee5fe988" exitCode=0 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884485 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd427e4d-d068-482b-bbd1-d71916608054" containerID="64d75d755e7515e0722d3c224c29aeb55c695c3d4e5c6a67676544802d444e35" exitCode=2 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884512 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd427e4d-d068-482b-bbd1-d71916608054" 
containerID="77b36762d92e201fe16de9a8b70191ed1b561797551f3eab10d13ff4945450de" exitCode=0 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerDied","Data":"5fb093357ddca1dd56146dfb176f613a9599438d3c987ab578dfe09fee5fe988"} Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerDied","Data":"64d75d755e7515e0722d3c224c29aeb55c695c3d4e5c6a67676544802d444e35"} Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.884583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerDied","Data":"77b36762d92e201fe16de9a8b70191ed1b561797551f3eab10d13ff4945450de"} Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.887942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" event={"ID":"52a97588-ac3f-4ee4-8f8b-43f305bf0392","Type":"ContainerStarted","Data":"3062fe4ac8bc383f5e95dfc646935dd4a31508dceb833a33cad1010a456cc631"} Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.888072 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-log" containerID="cri-o://0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.888255 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.888306 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-api" containerID="cri-o://ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4" gracePeriod=30 Feb 25 07:39:36 crc kubenswrapper[4749]: I0225 07:39:36.922486 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" podStartSLOduration=2.922467237 podStartE2EDuration="2.922467237s" podCreationTimestamp="2026-02-25 07:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:36.919095514 +0000 UTC m=+1330.280921554" watchObservedRunningTime="2026-02-25 07:39:36.922467237 +0000 UTC m=+1330.284293257" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.833108 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.897446 4749 generic.go:334] "Generic (PLEG): container finished" podID="6b424e62-82ef-4540-b049-c673cce6de0e" containerID="3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d" exitCode=137 Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.897515 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.897555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b424e62-82ef-4540-b049-c673cce6de0e","Type":"ContainerDied","Data":"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d"} Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.897611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b424e62-82ef-4540-b049-c673cce6de0e","Type":"ContainerDied","Data":"44bfff790476ccceeefe0b8164c6ebe5098bffa10e8ca44373dd4a4d6a18472a"} Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.897628 4749 scope.go:117] "RemoveContainer" containerID="3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.901341 4749 generic.go:334] "Generic (PLEG): container finished" podID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerID="0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c" exitCode=143 Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.901834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerDied","Data":"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c"} Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.923567 4749 scope.go:117] "RemoveContainer" containerID="3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d" Feb 25 07:39:37 crc kubenswrapper[4749]: E0225 07:39:37.924325 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d\": container with ID starting with 3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d not found: ID does not exist" 
containerID="3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.924364 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d"} err="failed to get container status \"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d\": rpc error: code = NotFound desc = could not find container \"3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d\": container with ID starting with 3804501c749d213631104532b903a9e8c929d9dc9ef6493e62a30e42bca5d14d not found: ID does not exist" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.926208 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle\") pod \"6b424e62-82ef-4540-b049-c673cce6de0e\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.926327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l98rj\" (UniqueName: \"kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj\") pod \"6b424e62-82ef-4540-b049-c673cce6de0e\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.926451 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data\") pod \"6b424e62-82ef-4540-b049-c673cce6de0e\" (UID: \"6b424e62-82ef-4540-b049-c673cce6de0e\") " Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.947816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj" (OuterVolumeSpecName: 
"kube-api-access-l98rj") pod "6b424e62-82ef-4540-b049-c673cce6de0e" (UID: "6b424e62-82ef-4540-b049-c673cce6de0e"). InnerVolumeSpecName "kube-api-access-l98rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.961935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b424e62-82ef-4540-b049-c673cce6de0e" (UID: "6b424e62-82ef-4540-b049-c673cce6de0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:37 crc kubenswrapper[4749]: I0225 07:39:37.966312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data" (OuterVolumeSpecName: "config-data") pod "6b424e62-82ef-4540-b049-c673cce6de0e" (UID: "6b424e62-82ef-4540-b049-c673cce6de0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.029025 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.029051 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l98rj\" (UniqueName: \"kubernetes.io/projected/6b424e62-82ef-4540-b049-c673cce6de0e-kube-api-access-l98rj\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.029061 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b424e62-82ef-4540-b049-c673cce6de0e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.235617 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.248782 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.261913 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:38 crc kubenswrapper[4749]: E0225 07:39:38.262295 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b424e62-82ef-4540-b049-c673cce6de0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.262312 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b424e62-82ef-4540-b049-c673cce6de0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.262495 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b424e62-82ef-4540-b049-c673cce6de0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 25 
07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.263174 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.272169 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.272630 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.275513 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.276612 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.435671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.435735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.435969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drxlx\" (UniqueName: \"kubernetes.io/projected/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-kube-api-access-drxlx\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.436049 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.436084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.542084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.542126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.542229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.542250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.542306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drxlx\" (UniqueName: \"kubernetes.io/projected/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-kube-api-access-drxlx\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.546031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.546955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.547447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 
07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.548413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.560189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drxlx\" (UniqueName: \"kubernetes.io/projected/e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08-kube-api-access-drxlx\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08\") " pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:38 crc kubenswrapper[4749]: I0225 07:39:38.581788 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.051468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 25 07:39:39 crc kubenswrapper[4749]: W0225 07:39:39.057168 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c9f2e8_f03e_40a1_bfa8_ea1fc396ad08.slice/crio-af21b9a2f6469e99776586de5d8b570ff05dd6a8d3ea9a343116687d5fd1cf75 WatchSource:0}: Error finding container af21b9a2f6469e99776586de5d8b570ff05dd6a8d3ea9a343116687d5fd1cf75: Status 404 returned error can't find the container with id af21b9a2f6469e99776586de5d8b570ff05dd6a8d3ea9a343116687d5fd1cf75 Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.341556 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b424e62-82ef-4540-b049-c673cce6de0e" path="/var/lib/kubelet/pods/6b424e62-82ef-4540-b049-c673cce6de0e/volumes" Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.944256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08","Type":"ContainerStarted","Data":"59dc4d2cd05226eab4fdeed98c1cd939a14aafcd1a5dd9485b5e423a627b462a"} Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.944640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08","Type":"ContainerStarted","Data":"af21b9a2f6469e99776586de5d8b570ff05dd6a8d3ea9a343116687d5fd1cf75"} Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.946805 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd427e4d-d068-482b-bbd1-d71916608054" containerID="8d270c819cb2df8bbda86ed33bcba2e36abe9c4f47f29052de47d04b43bd218c" exitCode=0 Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.946852 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerDied","Data":"8d270c819cb2df8bbda86ed33bcba2e36abe9c4f47f29052de47d04b43bd218c"} Feb 25 07:39:39 crc kubenswrapper[4749]: I0225 07:39:39.971502 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.971483804 podStartE2EDuration="1.971483804s" podCreationTimestamp="2026-02-25 07:39:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:39.962215601 +0000 UTC m=+1333.324041621" watchObservedRunningTime="2026-02-25 07:39:39.971483804 +0000 UTC m=+1333.333309824" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.209521 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.379842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gzbl\" (UniqueName: \"kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380231 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380335 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.380395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.382192 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.382273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.390086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl" (OuterVolumeSpecName: "kube-api-access-7gzbl") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "kube-api-access-7gzbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.409508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts" (OuterVolumeSpecName: "scripts") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.412467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.437610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.459038 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.481494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data" (OuterVolumeSpecName: "config-data") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.481995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") pod \"bd427e4d-d068-482b-bbd1-d71916608054\" (UID: \"bd427e4d-d068-482b-bbd1-d71916608054\") " Feb 25 07:39:40 crc kubenswrapper[4749]: W0225 07:39:40.482117 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bd427e4d-d068-482b-bbd1-d71916608054/volumes/kubernetes.io~secret/config-data Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482152 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data" (OuterVolumeSpecName: "config-data") pod "bd427e4d-d068-482b-bbd1-d71916608054" (UID: "bd427e4d-d068-482b-bbd1-d71916608054"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482777 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482809 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gzbl\" (UniqueName: \"kubernetes.io/projected/bd427e4d-d068-482b-bbd1-d71916608054-kube-api-access-7gzbl\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482830 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482844 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-scripts\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482855 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482867 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd427e4d-d068-482b-bbd1-d71916608054-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482879 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.482893 4749 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd427e4d-d068-482b-bbd1-d71916608054-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.511336 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.585400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle\") pod \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.585523 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data\") pod \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.585639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs\") pod \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.585731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz7qf\" (UniqueName: \"kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf\") pod \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\" (UID: \"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e\") " Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.587304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs" (OuterVolumeSpecName: 
"logs") pod "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" (UID: "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.592085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf" (OuterVolumeSpecName: "kube-api-access-vz7qf") pod "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" (UID: "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e"). InnerVolumeSpecName "kube-api-access-vz7qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.619178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data" (OuterVolumeSpecName: "config-data") pod "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" (UID: "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.636146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" (UID: "c44b42a6-ec9f-4317-b25d-d004f9b2fa2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.688304 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.688343 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.688356 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.688367 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz7qf\" (UniqueName: \"kubernetes.io/projected/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e-kube-api-access-vz7qf\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.978766 4749 generic.go:334] "Generic (PLEG): container finished" podID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerID="ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4" exitCode=0 Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.978797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerDied","Data":"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4"} Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.978842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c44b42a6-ec9f-4317-b25d-d004f9b2fa2e","Type":"ContainerDied","Data":"b6f4e1aca283b424660ad1ebc6763fde21992b749fc859cf12cddea9d0377ce5"} Feb 25 07:39:40 crc kubenswrapper[4749]: 
I0225 07:39:40.978861 4749 scope.go:117] "RemoveContainer" containerID="ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.978994 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.982717 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:40 crc kubenswrapper[4749]: I0225 07:39:40.985116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd427e4d-d068-482b-bbd1-d71916608054","Type":"ContainerDied","Data":"fcf5bf3bd534a214fa7819b43587b62b7e5c3a62107deadb05cdcdb573760b46"} Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.043126 4749 scope.go:117] "RemoveContainer" containerID="0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.055784 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.072355 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.084889 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098025 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098496 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-api" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098519 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-api" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098539 
4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-notification-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098560 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-notification-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098574 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="proxy-httpd" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098582 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="proxy-httpd" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098624 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-log" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098631 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-log" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-central-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098654 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-central-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.098672 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="sg-core" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098678 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="sg-core" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 
07:39:41.098850 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-central-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098859 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="sg-core" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098866 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="proxy-httpd" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098872 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-log" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098960 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" containerName="nova-api-api" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.098967 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd427e4d-d068-482b-bbd1-d71916608054" containerName="ceilometer-notification-agent" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.101033 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.104872 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.105176 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.105302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.105457 4749 scope.go:117] "RemoveContainer" containerID="ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4" Feb 25 07:39:41 crc kubenswrapper[4749]: E0225 07:39:41.106146 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4\": container with ID starting with ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4 not found: ID does not exist" containerID="ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.106173 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4"} err="failed to get container status \"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4\": rpc error: code = NotFound desc = could not find container \"ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4\": container with ID starting with ee3e37df99482655b5c92910ee0399255ca801296c7cf1713dbebcd913065ea4 not found: ID does not exist" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.106194 4749 scope.go:117] "RemoveContainer" containerID="0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c" Feb 25 07:39:41 crc 
kubenswrapper[4749]: E0225 07:39:41.109861 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c\": container with ID starting with 0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c not found: ID does not exist" containerID="0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.109896 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c"} err="failed to get container status \"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c\": rpc error: code = NotFound desc = could not find container \"0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c\": container with ID starting with 0a6faa4685fa65814b85873bcaa2cbccce881715546acb6dc427fb0f2c2a499c not found: ID does not exist" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.109915 4749 scope.go:117] "RemoveContainer" containerID="5fb093357ddca1dd56146dfb176f613a9599438d3c987ab578dfe09fee5fe988" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.111007 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.125574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.135823 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.137820 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.140030 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.140144 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.140100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.148743 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.166109 4749 scope.go:117] "RemoveContainer" containerID="64d75d755e7515e0722d3c224c29aeb55c695c3d4e5c6a67676544802d444e35" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.192688 4749 scope.go:117] "RemoveContainer" containerID="8d270c819cb2df8bbda86ed33bcba2e36abe9c4f47f29052de47d04b43bd218c" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-log-httpd\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-scripts\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwm79\" (UniqueName: \"kubernetes.io/projected/b3e4586b-587a-4c8f-9387-70cb52411a46-kube-api-access-kwm79\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-config-data\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.207888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.210507 4749 scope.go:117] "RemoveContainer" containerID="77b36762d92e201fe16de9a8b70191ed1b561797551f3eab10d13ff4945450de" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.310046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwm79\" (UniqueName: \"kubernetes.io/projected/b3e4586b-587a-4c8f-9387-70cb52411a46-kube-api-access-kwm79\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.310100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.310168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-config-data\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-run-httpd\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5k4\" (UniqueName: \"kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-log-httpd\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311573 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311622 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.311708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-scripts\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.312112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-run-httpd\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.312422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b3e4586b-587a-4c8f-9387-70cb52411a46-log-httpd\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " 
pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.313829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.315658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.316176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-config-data\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.322158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.333682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3e4586b-587a-4c8f-9387-70cb52411a46-scripts\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0" Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.339213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwm79\" (UniqueName: 
\"kubernetes.io/projected/b3e4586b-587a-4c8f-9387-70cb52411a46-kube-api-access-kwm79\") pod \"ceilometer-0\" (UID: \"b3e4586b-587a-4c8f-9387-70cb52411a46\") " pod="openstack/ceilometer-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.341247 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd427e4d-d068-482b-bbd1-d71916608054" path="/var/lib/kubelet/pods/bd427e4d-d068-482b-bbd1-d71916608054/volumes"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.342176 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c44b42a6-ec9f-4317-b25d-d004f9b2fa2e" path="/var/lib/kubelet/pods/c44b42a6-ec9f-4317-b25d-d004f9b2fa2e/volumes"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.412789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.412907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.412950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.413003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5k4\" (UniqueName: \"kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.413036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.413076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.413557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.416786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.416865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.417034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.419095 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.425086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.428850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5k4\" (UniqueName: \"kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4\") pod \"nova-api-0\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.452392 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.952438 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 25 07:39:41 crc kubenswrapper[4749]: W0225 07:39:41.957953 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e4586b_587a_4c8f_9387_70cb52411a46.slice/crio-28f7d1834e1074d3bcb9765f5c5864426316ef3ce5c8c7cc1872eac6d95e0fc0 WatchSource:0}: Error finding container 28f7d1834e1074d3bcb9765f5c5864426316ef3ce5c8c7cc1872eac6d95e0fc0: Status 404 returned error can't find the container with id 28f7d1834e1074d3bcb9765f5c5864426316ef3ce5c8c7cc1872eac6d95e0fc0
Feb 25 07:39:41 crc kubenswrapper[4749]: I0225 07:39:41.991492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3e4586b-587a-4c8f-9387-70cb52411a46","Type":"ContainerStarted","Data":"28f7d1834e1074d3bcb9765f5c5864426316ef3ce5c8c7cc1872eac6d95e0fc0"}
Feb 25 07:39:42 crc kubenswrapper[4749]: I0225 07:39:42.082345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.006104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3e4586b-587a-4c8f-9387-70cb52411a46","Type":"ContainerStarted","Data":"99a0463e2f954de1755e4469ab97749370eee20b764637566220ace8d2de59a1"}
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.009374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerStarted","Data":"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd"}
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.009420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerStarted","Data":"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea"}
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.009432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerStarted","Data":"718205088c6eff5c06faa49f6b9cecb0a9337a3676e18110c2ee2a09974a5331"}
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.041437 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.041414459 podStartE2EDuration="2.041414459s" podCreationTimestamp="2026-02-25 07:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:43.037031444 +0000 UTC m=+1336.398857464" watchObservedRunningTime="2026-02-25 07:39:43.041414459 +0000 UTC m=+1336.403240479"
Feb 25 07:39:43 crc kubenswrapper[4749]: I0225 07:39:43.582726 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 07:39:44 crc kubenswrapper[4749]: I0225 07:39:44.020312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3e4586b-587a-4c8f-9387-70cb52411a46","Type":"ContainerStarted","Data":"66652aa7ff42a957abb44396365019c065046a44ea52be819da57ab3eb6d7fbf"}
Feb 25 07:39:44 crc kubenswrapper[4749]: I0225 07:39:44.020385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3e4586b-587a-4c8f-9387-70cb52411a46","Type":"ContainerStarted","Data":"967f0464ec00433400f57b2bba7efcd0c3de370fa0053f47647a552052cd9c98"}
Feb 25 07:39:44 crc kubenswrapper[4749]: I0225 07:39:44.366529 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz"
Feb 25 07:39:44 crc kubenswrapper[4749]: I0225 07:39:44.469266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"]
Feb 25 07:39:44 crc kubenswrapper[4749]: I0225 07:39:44.469718 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="dnsmasq-dns" containerID="cri-o://73251f4d1d186f22486ad55e47907bae7d9f70049407db8c086b5b7e446dcd4f" gracePeriod=10
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.031357 4749 generic.go:334] "Generic (PLEG): container finished" podID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerID="73251f4d1d186f22486ad55e47907bae7d9f70049407db8c086b5b7e446dcd4f" exitCode=0
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.031673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" event={"ID":"37f54cc1-6ecd-435a-b969-fceaa88d6d4f","Type":"ContainerDied","Data":"73251f4d1d186f22486ad55e47907bae7d9f70049407db8c086b5b7e446dcd4f"}
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.031702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6" event={"ID":"37f54cc1-6ecd-435a-b969-fceaa88d6d4f","Type":"ContainerDied","Data":"cb0239b4fabff63dcf445a9b8d7a8dc670507903714edc301e07e9aa571f5f69"}
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.031713 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb0239b4fabff63dcf445a9b8d7a8dc670507903714edc301e07e9aa571f5f69"
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.037904 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6"
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.221497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.221573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.222325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.222587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.222634 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vckl\" (UniqueName: \"kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.222746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb\") pod \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\" (UID: \"37f54cc1-6ecd-435a-b969-fceaa88d6d4f\") "
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.228152 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl" (OuterVolumeSpecName: "kube-api-access-6vckl") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "kube-api-access-6vckl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.278255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.290436 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.299387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config" (OuterVolumeSpecName: "config") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.300854 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.303608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37f54cc1-6ecd-435a-b969-fceaa88d6d4f" (UID: "37f54cc1-6ecd-435a-b969-fceaa88d6d4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325187 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325221 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325235 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325247 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-config\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325259 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vckl\" (UniqueName: \"kubernetes.io/projected/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-kube-api-access-6vckl\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:45 crc kubenswrapper[4749]: I0225 07:39:45.325270 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37f54cc1-6ecd-435a-b969-fceaa88d6d4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.042319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b3e4586b-587a-4c8f-9387-70cb52411a46","Type":"ContainerStarted","Data":"32afb7240ea3a1b37cadd886964003e1fb82d6dd99c06dd56cc637fe6f78dfed"}
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.042660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.042338 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-d5nx6"
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.077825 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.695922414 podStartE2EDuration="5.077809084s" podCreationTimestamp="2026-02-25 07:39:41 +0000 UTC" firstStartedPulling="2026-02-25 07:39:41.960301105 +0000 UTC m=+1335.322127125" lastFinishedPulling="2026-02-25 07:39:45.342187775 +0000 UTC m=+1338.704013795" observedRunningTime="2026-02-25 07:39:46.067424953 +0000 UTC m=+1339.429250973" watchObservedRunningTime="2026-02-25 07:39:46.077809084 +0000 UTC m=+1339.439635104"
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.089463 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"]
Feb 25 07:39:46 crc kubenswrapper[4749]: I0225 07:39:46.097686 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-d5nx6"]
Feb 25 07:39:47 crc kubenswrapper[4749]: I0225 07:39:47.335814 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" path="/var/lib/kubelet/pods/37f54cc1-6ecd-435a-b969-fceaa88d6d4f/volumes"
Feb 25 07:39:48 crc kubenswrapper[4749]: I0225 07:39:48.583110 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 07:39:48 crc kubenswrapper[4749]: I0225 07:39:48.613717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.099872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.385179 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pcvmq"]
Feb 25 07:39:49 crc kubenswrapper[4749]: E0225 07:39:49.385670 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="init"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.385692 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="init"
Feb 25 07:39:49 crc kubenswrapper[4749]: E0225 07:39:49.385711 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="dnsmasq-dns"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.385718 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="dnsmasq-dns"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.385973 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f54cc1-6ecd-435a-b969-fceaa88d6d4f" containerName="dnsmasq-dns"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.387153 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.395304 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.395416 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.397135 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pcvmq"]
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.510353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcsgh\" (UniqueName: \"kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.510755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.510827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.510866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.612366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.612712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcsgh\" (UniqueName: \"kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.612780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.612848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.619635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.623294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.624044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.642848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcsgh\" (UniqueName: \"kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh\") pod \"nova-cell1-cell-mapping-pcvmq\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") " pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:49 crc kubenswrapper[4749]: I0225 07:39:49.714882 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:50 crc kubenswrapper[4749]: I0225 07:39:50.299002 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pcvmq"]
Feb 25 07:39:51 crc kubenswrapper[4749]: I0225 07:39:51.095364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pcvmq" event={"ID":"d5010dcd-fe23-46b8-8df4-f955ce98e324","Type":"ContainerStarted","Data":"0516d6b693dd90cb486eb534267e8f17e563026993c67fb617028dd39bce440f"}
Feb 25 07:39:51 crc kubenswrapper[4749]: I0225 07:39:51.095719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pcvmq" event={"ID":"d5010dcd-fe23-46b8-8df4-f955ce98e324","Type":"ContainerStarted","Data":"8ef310d595a63480ab017699c32ac67f8e909ae781a5d79c0556b390eeb9b382"}
Feb 25 07:39:51 crc kubenswrapper[4749]: I0225 07:39:51.119076 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pcvmq" podStartSLOduration=2.119056356 podStartE2EDuration="2.119056356s" podCreationTimestamp="2026-02-25 07:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:39:51.114855554 +0000 UTC m=+1344.476681564" watchObservedRunningTime="2026-02-25 07:39:51.119056356 +0000 UTC m=+1344.480882376"
Feb 25 07:39:51 crc kubenswrapper[4749]: I0225 07:39:51.452866 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 25 07:39:51 crc kubenswrapper[4749]: I0225 07:39:51.452941 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 25 07:39:52 crc kubenswrapper[4749]: I0225 07:39:52.465842 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:39:52 crc kubenswrapper[4749]: I0225 07:39:52.465863 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:39:55 crc kubenswrapper[4749]: I0225 07:39:55.151860 4749 generic.go:334] "Generic (PLEG): container finished" podID="d5010dcd-fe23-46b8-8df4-f955ce98e324" containerID="0516d6b693dd90cb486eb534267e8f17e563026993c67fb617028dd39bce440f" exitCode=0
Feb 25 07:39:55 crc kubenswrapper[4749]: I0225 07:39:55.151927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pcvmq" event={"ID":"d5010dcd-fe23-46b8-8df4-f955ce98e324","Type":"ContainerDied","Data":"0516d6b693dd90cb486eb534267e8f17e563026993c67fb617028dd39bce440f"}
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.622959 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.760933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle\") pod \"d5010dcd-fe23-46b8-8df4-f955ce98e324\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") "
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.761036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data\") pod \"d5010dcd-fe23-46b8-8df4-f955ce98e324\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") "
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.761127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcsgh\" (UniqueName: \"kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh\") pod \"d5010dcd-fe23-46b8-8df4-f955ce98e324\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") "
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.761174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts\") pod \"d5010dcd-fe23-46b8-8df4-f955ce98e324\" (UID: \"d5010dcd-fe23-46b8-8df4-f955ce98e324\") "
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.766452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh" (OuterVolumeSpecName: "kube-api-access-bcsgh") pod "d5010dcd-fe23-46b8-8df4-f955ce98e324" (UID: "d5010dcd-fe23-46b8-8df4-f955ce98e324"). InnerVolumeSpecName "kube-api-access-bcsgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.767092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts" (OuterVolumeSpecName: "scripts") pod "d5010dcd-fe23-46b8-8df4-f955ce98e324" (UID: "d5010dcd-fe23-46b8-8df4-f955ce98e324"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.797484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data" (OuterVolumeSpecName: "config-data") pod "d5010dcd-fe23-46b8-8df4-f955ce98e324" (UID: "d5010dcd-fe23-46b8-8df4-f955ce98e324"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.801998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5010dcd-fe23-46b8-8df4-f955ce98e324" (UID: "d5010dcd-fe23-46b8-8df4-f955ce98e324"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.863364 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-scripts\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.863413 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.863426 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5010dcd-fe23-46b8-8df4-f955ce98e324-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:56 crc kubenswrapper[4749]: I0225 07:39:56.863438 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcsgh\" (UniqueName: \"kubernetes.io/projected/d5010dcd-fe23-46b8-8df4-f955ce98e324-kube-api-access-bcsgh\") on node \"crc\" DevicePath \"\""
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.180536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pcvmq" event={"ID":"d5010dcd-fe23-46b8-8df4-f955ce98e324","Type":"ContainerDied","Data":"8ef310d595a63480ab017699c32ac67f8e909ae781a5d79c0556b390eeb9b382"}
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.180971 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef310d595a63480ab017699c32ac67f8e909ae781a5d79c0556b390eeb9b382"
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.180644 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pcvmq"
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.384645 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.384933 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="426a8ace-3a5f-4695-8f7f-0710b1418b00" containerName="nova-scheduler-scheduler" containerID="cri-o://e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4" gracePeriod=30
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.402071 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.402405 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-api" containerID="cri-o://b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd" gracePeriod=30
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.402859 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-log" containerID="cri-o://311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea" gracePeriod=30
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.465520 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.465940 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" containerID="cri-o://508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db" gracePeriod=30
Feb 25 07:39:57 crc kubenswrapper[4749]: I0225 07:39:57.466537 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" containerID="cri-o://19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de" gracePeriod=30
Feb 25 07:39:58 crc kubenswrapper[4749]: I0225 07:39:58.194049 4749 generic.go:334] "Generic (PLEG): container finished" podID="55a553fd-17dd-4a2b-9309-6f2777567277" containerID="311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea" exitCode=143
Feb 25 07:39:58 crc kubenswrapper[4749]: I0225 07:39:58.194132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerDied","Data":"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea"}
Feb 25 07:39:58 crc kubenswrapper[4749]: I0225 07:39:58.197724 4749 generic.go:334] "Generic (PLEG): container finished" podID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerID="508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db" exitCode=143
Feb 25 07:39:58 crc kubenswrapper[4749]: I0225 07:39:58.197760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerDied","Data":"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db"}
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.121202 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.209980 4749 generic.go:334] "Generic (PLEG): container finished" podID="426a8ace-3a5f-4695-8f7f-0710b1418b00" containerID="e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4" exitCode=0
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.210092 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.210738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"426a8ace-3a5f-4695-8f7f-0710b1418b00","Type":"ContainerDied","Data":"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4"}
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.210792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"426a8ace-3a5f-4695-8f7f-0710b1418b00","Type":"ContainerDied","Data":"60a7a63df348b0a68297f147be6c3a63a62fb3b8f70a85c7181e474fb64bf935"}
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.210816 4749 scope.go:117] "RemoveContainer" containerID="e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4"
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.222477 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44f6k\" (UniqueName: \"kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k\") pod \"426a8ace-3a5f-4695-8f7f-0710b1418b00\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") "
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.222666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle\") pod \"426a8ace-3a5f-4695-8f7f-0710b1418b00\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") "
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.222710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data\") pod \"426a8ace-3a5f-4695-8f7f-0710b1418b00\" (UID: \"426a8ace-3a5f-4695-8f7f-0710b1418b00\") "
Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.231770 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k" (OuterVolumeSpecName: "kube-api-access-44f6k") pod "426a8ace-3a5f-4695-8f7f-0710b1418b00" (UID: "426a8ace-3a5f-4695-8f7f-0710b1418b00"). InnerVolumeSpecName "kube-api-access-44f6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.256112 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "426a8ace-3a5f-4695-8f7f-0710b1418b00" (UID: "426a8ace-3a5f-4695-8f7f-0710b1418b00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.266534 4749 scope.go:117] "RemoveContainer" containerID="e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4" Feb 25 07:39:59 crc kubenswrapper[4749]: E0225 07:39:59.268037 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4\": container with ID starting with e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4 not found: ID does not exist" containerID="e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.268082 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4"} err="failed to get container status \"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4\": rpc error: code = NotFound desc = could not find container \"e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4\": container with ID starting with 
e0c4a7d7984482482531eb3d0dc49390be1251acbad4f32f7dcf6c990a0e9cf4 not found: ID does not exist" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.282509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data" (OuterVolumeSpecName: "config-data") pod "426a8ace-3a5f-4695-8f7f-0710b1418b00" (UID: "426a8ace-3a5f-4695-8f7f-0710b1418b00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.324856 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.324889 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/426a8ace-3a5f-4695-8f7f-0710b1418b00-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.324898 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44f6k\" (UniqueName: \"kubernetes.io/projected/426a8ace-3a5f-4695-8f7f-0710b1418b00-kube-api-access-44f6k\") on node \"crc\" DevicePath \"\"" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.529078 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.535644 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.544865 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:59 crc kubenswrapper[4749]: E0225 07:39:59.545418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5010dcd-fe23-46b8-8df4-f955ce98e324" containerName="nova-manage" Feb 25 
07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.545476 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5010dcd-fe23-46b8-8df4-f955ce98e324" containerName="nova-manage" Feb 25 07:39:59 crc kubenswrapper[4749]: E0225 07:39:59.545561 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426a8ace-3a5f-4695-8f7f-0710b1418b00" containerName="nova-scheduler-scheduler" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.545629 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="426a8ace-3a5f-4695-8f7f-0710b1418b00" containerName="nova-scheduler-scheduler" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.545853 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="426a8ace-3a5f-4695-8f7f-0710b1418b00" containerName="nova-scheduler-scheduler" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.545932 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5010dcd-fe23-46b8-8df4-f955ce98e324" containerName="nova-manage" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.546525 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.549061 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.554196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.731999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.732471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-config-data\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.732845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglz4\" (UniqueName: \"kubernetes.io/projected/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-kube-api-access-sglz4\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.835502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.835641 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-config-data\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.835813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglz4\" (UniqueName: \"kubernetes.io/projected/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-kube-api-access-sglz4\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.840678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.842327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-config-data\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.858082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglz4\" (UniqueName: \"kubernetes.io/projected/adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79-kube-api-access-sglz4\") pod \"nova-scheduler-0\" (UID: \"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79\") " pod="openstack/nova-scheduler-0" Feb 25 07:39:59 crc kubenswrapper[4749]: I0225 07:39:59.863205 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.156985 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533420-ktflf"] Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.158842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.161715 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.161963 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.162383 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.168137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533420-ktflf"] Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.346547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbbd\" (UniqueName: \"kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd\") pod \"auto-csr-approver-29533420-ktflf\" (UID: \"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0\") " pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.381429 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.487910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dbbd\" (UniqueName: \"kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd\") pod 
\"auto-csr-approver-29533420-ktflf\" (UID: \"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0\") " pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.503711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dbbd\" (UniqueName: \"kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd\") pod \"auto-csr-approver-29533420-ktflf\" (UID: \"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0\") " pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.595686 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:39996->10.217.0.209:8775: read: connection reset by peer" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.595725 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:39992->10.217.0.209:8775: read: connection reset by peer" Feb 25 07:40:00 crc kubenswrapper[4749]: I0225 07:40:00.788546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.037777 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.068410 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s5k4\" (UniqueName: \"kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099410 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjl6m\" (UniqueName: \"kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m\") pod \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data\") pod \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs\") pod \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle\") pod \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs\") pod \"55a553fd-17dd-4a2b-9309-6f2777567277\" (UID: \"55a553fd-17dd-4a2b-9309-6f2777567277\") " Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.099937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs\") pod \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\" (UID: \"2cd271b1-f5b0-432d-bd41-f53c28744b6e\") " Feb 25 07:40:01 crc kubenswrapper[4749]: 
I0225 07:40:01.102014 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs" (OuterVolumeSpecName: "logs") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.113049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs" (OuterVolumeSpecName: "logs") pod "2cd271b1-f5b0-432d-bd41-f53c28744b6e" (UID: "2cd271b1-f5b0-432d-bd41-f53c28744b6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.124399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4" (OuterVolumeSpecName: "kube-api-access-9s5k4") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "kube-api-access-9s5k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.127818 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m" (OuterVolumeSpecName: "kube-api-access-bjl6m") pod "2cd271b1-f5b0-432d-bd41-f53c28744b6e" (UID: "2cd271b1-f5b0-432d-bd41-f53c28744b6e"). InnerVolumeSpecName "kube-api-access-bjl6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.165413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd271b1-f5b0-432d-bd41-f53c28744b6e" (UID: "2cd271b1-f5b0-432d-bd41-f53c28744b6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.168086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data" (OuterVolumeSpecName: "config-data") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.171073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.189848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data" (OuterVolumeSpecName: "config-data") pod "2cd271b1-f5b0-432d-bd41-f53c28744b6e" (UID: "2cd271b1-f5b0-432d-bd41-f53c28744b6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.198422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2cd271b1-f5b0-432d-bd41-f53c28744b6e" (UID: "2cd271b1-f5b0-432d-bd41-f53c28744b6e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201285 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201313 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s5k4\" (UniqueName: \"kubernetes.io/projected/55a553fd-17dd-4a2b-9309-6f2777567277-kube-api-access-9s5k4\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201325 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201334 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjl6m\" (UniqueName: \"kubernetes.io/projected/2cd271b1-f5b0-432d-bd41-f53c28744b6e-kube-api-access-bjl6m\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201344 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201354 4749 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201363 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a553fd-17dd-4a2b-9309-6f2777567277-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201371 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd271b1-f5b0-432d-bd41-f53c28744b6e-logs\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.201378 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd271b1-f5b0-432d-bd41-f53c28744b6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.219996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.220467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55a553fd-17dd-4a2b-9309-6f2777567277" (UID: "55a553fd-17dd-4a2b-9309-6f2777567277"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.234089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79","Type":"ContainerStarted","Data":"eca1c1c7328ec284bf668e33c960f17fc6bf7221a251c33dc6fdcbb5330b2822"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.234388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79","Type":"ContainerStarted","Data":"fa8c7725768f77d7fa119bf070bf0e4b2899a03296ed5bb6bccf64ec1e891be0"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.242728 4749 generic.go:334] "Generic (PLEG): container finished" podID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerID="19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de" exitCode=0 Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.242893 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.244118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerDied","Data":"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.244194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2cd271b1-f5b0-432d-bd41-f53c28744b6e","Type":"ContainerDied","Data":"fbde5e2d33077ee2ec05cdab5c8691e53052f9ed34d034895deb7261a6707df6"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.244258 4749 scope.go:117] "RemoveContainer" containerID="19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.252196 4749 generic.go:334] "Generic (PLEG): container finished" podID="55a553fd-17dd-4a2b-9309-6f2777567277" containerID="b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd" exitCode=0 Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.252233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerDied","Data":"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.252278 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.252309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55a553fd-17dd-4a2b-9309-6f2777567277","Type":"ContainerDied","Data":"718205088c6eff5c06faa49f6b9cecb0a9337a3676e18110c2ee2a09974a5331"} Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.253112 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.253102055 podStartE2EDuration="2.253102055s" podCreationTimestamp="2026-02-25 07:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:40:01.248116804 +0000 UTC m=+1354.609942824" watchObservedRunningTime="2026-02-25 07:40:01.253102055 +0000 UTC m=+1354.614928075" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.289196 4749 scope.go:117] "RemoveContainer" containerID="508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.289331 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.299099 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.302963 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.302985 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55a553fd-17dd-4a2b-9309-6f2777567277-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.314625 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.326728 4749 scope.go:117] "RemoveContainer" containerID="19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.327161 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de\": container with ID starting with 19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de not found: ID does not exist" containerID="19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.327193 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de"} err="failed to get container status \"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de\": rpc error: code = NotFound desc = could not find container \"19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de\": container with ID starting with 19a95cb2909c30e3a1e317179e17cda185a652915e59e329c53e70ebaf00d9de not found: ID does not exist" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.327212 4749 scope.go:117] "RemoveContainer" containerID="508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.327701 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db\": container with ID starting with 508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db not found: ID does not exist" containerID="508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db" Feb 25 07:40:01 crc 
kubenswrapper[4749]: I0225 07:40:01.327730 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db"} err="failed to get container status \"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db\": rpc error: code = NotFound desc = could not find container \"508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db\": container with ID starting with 508bd6223aa1082f94ccb2a3a2541afa010370c8703cb394518e7df8d68660db not found: ID does not exist" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.327742 4749 scope.go:117] "RemoveContainer" containerID="b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.345419 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426a8ace-3a5f-4695-8f7f-0710b1418b00" path="/var/lib/kubelet/pods/426a8ace-3a5f-4695-8f7f-0710b1418b00/volumes" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.347156 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" path="/var/lib/kubelet/pods/55a553fd-17dd-4a2b-9309-6f2777567277/volumes" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.347733 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.347758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.348074 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-api" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348086 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-api" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.348100 
4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-log" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348106 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-log" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.348141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348148 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.348159 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348379 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-log" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348392 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" containerName="nova-metadata-metadata" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348409 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-api" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.348422 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a553fd-17dd-4a2b-9309-6f2777567277" containerName="nova-api-log" Feb 25 07:40:01 crc kubenswrapper[4749]: 
I0225 07:40:01.350068 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533420-ktflf"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.350148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.355262 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.355555 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.356212 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.357068 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.365102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.375085 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.375827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.376611 4749 scope.go:117] "RemoveContainer" containerID="311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.378023 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.378537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.397647 4749 scope.go:117] "RemoveContainer" containerID="b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.400073 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd\": container with ID starting with b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd not found: ID does not exist" containerID="b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.400125 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd"} err="failed to get container status \"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd\": rpc error: code = NotFound desc = could not find container \"b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd\": container with ID starting with b6b28a2849ccb26ce5ca1c5bdfd2308cd2a7dda4c78cfe2c2a33290863d519bd not found: ID does not exist" Feb 25 07:40:01 crc 
kubenswrapper[4749]: I0225 07:40:01.400155 4749 scope.go:117] "RemoveContainer" containerID="311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea" Feb 25 07:40:01 crc kubenswrapper[4749]: E0225 07:40:01.400541 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea\": container with ID starting with 311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea not found: ID does not exist" containerID="311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.400583 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea"} err="failed to get container status \"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea\": rpc error: code = NotFound desc = could not find container \"311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea\": container with ID starting with 311bac65478231f94872a43124d4999d678b6cab302489ea468254767c75c5ea not found: ID does not exist" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzgc\" (UniqueName: \"kubernetes.io/projected/b8b4782e-5a90-4774-819b-dc12f4c1b585-kube-api-access-vkzgc\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " 
pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-config-data\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b4782e-5a90-4774-819b-dc12f4c1b585-logs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405571 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmp4\" (UniqueName: \"kubernetes.io/projected/ea427259-a5cd-455d-a3c3-7031a607e42c-kube-api-access-crmp4\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea427259-a5cd-455d-a3c3-7031a607e42c-logs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.405660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-config-data\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea427259-a5cd-455d-a3c3-7031a607e42c-logs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-combined-ca-bundle\") 
pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-config-data\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzgc\" (UniqueName: \"kubernetes.io/projected/b8b4782e-5a90-4774-819b-dc12f4c1b585-kube-api-access-vkzgc\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507825 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-config-data\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b4782e-5a90-4774-819b-dc12f4c1b585-logs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.507887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crmp4\" (UniqueName: \"kubernetes.io/projected/ea427259-a5cd-455d-a3c3-7031a607e42c-kube-api-access-crmp4\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.508990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea427259-a5cd-455d-a3c3-7031a607e42c-logs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.510141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8b4782e-5a90-4774-819b-dc12f4c1b585-logs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " 
pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.514468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.514867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-public-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.514934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-config-data\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.517975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8b4782e-5a90-4774-819b-dc12f4c1b585-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.519490 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-config-data\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.520757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.522118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmp4\" (UniqueName: \"kubernetes.io/projected/ea427259-a5cd-455d-a3c3-7031a607e42c-kube-api-access-crmp4\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.523926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea427259-a5cd-455d-a3c3-7031a607e42c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ea427259-a5cd-455d-a3c3-7031a607e42c\") " pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.526502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzgc\" (UniqueName: \"kubernetes.io/projected/b8b4782e-5a90-4774-819b-dc12f4c1b585-kube-api-access-vkzgc\") pod \"nova-metadata-0\" (UID: \"b8b4782e-5a90-4774-819b-dc12f4c1b585\") " pod="openstack/nova-metadata-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.672795 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 25 07:40:01 crc kubenswrapper[4749]: I0225 07:40:01.695028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 25 07:40:02 crc kubenswrapper[4749]: I0225 07:40:02.147019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 25 07:40:02 crc kubenswrapper[4749]: W0225 07:40:02.217290 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b4782e_5a90_4774_819b_dc12f4c1b585.slice/crio-f04ae8baa6dd7086c166b1dc42c9f6f014c003d32963ff0467a3b63f3afce091 WatchSource:0}: Error finding container f04ae8baa6dd7086c166b1dc42c9f6f014c003d32963ff0467a3b63f3afce091: Status 404 returned error can't find the container with id f04ae8baa6dd7086c166b1dc42c9f6f014c003d32963ff0467a3b63f3afce091 Feb 25 07:40:02 crc kubenswrapper[4749]: I0225 07:40:02.222345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 25 07:40:02 crc kubenswrapper[4749]: I0225 07:40:02.263319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea427259-a5cd-455d-a3c3-7031a607e42c","Type":"ContainerStarted","Data":"e9ff6be46d5c5b24b81a4b89f04ccff6da8f96e26f134f240ae1b0b7cc26a85a"} Feb 25 07:40:02 crc kubenswrapper[4749]: I0225 07:40:02.264515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533420-ktflf" event={"ID":"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0","Type":"ContainerStarted","Data":"c28fdd8d7a3b00bf801cd8d4847d3b3c4cd92a1cd82efb312028f8afd4c7b9be"} Feb 25 07:40:02 crc kubenswrapper[4749]: I0225 07:40:02.265630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8b4782e-5a90-4774-819b-dc12f4c1b585","Type":"ContainerStarted","Data":"f04ae8baa6dd7086c166b1dc42c9f6f014c003d32963ff0467a3b63f3afce091"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.280289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ea427259-a5cd-455d-a3c3-7031a607e42c","Type":"ContainerStarted","Data":"416df7901507c3572f9fc1e454dd20e0a2f31f8ff1406810d50c2f1413aea251"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.281050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ea427259-a5cd-455d-a3c3-7031a607e42c","Type":"ContainerStarted","Data":"52e4041110c05c8c5a1184a07157e20b1a6a1c01c7066f5f9121651ab98049a6"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.283726 4749 generic.go:334] "Generic (PLEG): container finished" podID="bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" containerID="3310a2a268d1fb19f153267b0dec4884f57c85e5ce309c5c1e88e823a73dc51a" exitCode=0 Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.283823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533420-ktflf" event={"ID":"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0","Type":"ContainerDied","Data":"3310a2a268d1fb19f153267b0dec4884f57c85e5ce309c5c1e88e823a73dc51a"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.286927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8b4782e-5a90-4774-819b-dc12f4c1b585","Type":"ContainerStarted","Data":"5b82fc1610693db2240ca29a27b4907f4fdc46129c0afc0abf6445f7410da5ff"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.286972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8b4782e-5a90-4774-819b-dc12f4c1b585","Type":"ContainerStarted","Data":"cb51df010c13d086e59cd55538c3ec067d73233c738b36dcfc01272f11ff14a9"} Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.327473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.3274447609999998 podStartE2EDuration="2.327444761s" podCreationTimestamp="2026-02-25 07:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:40:03.307862008 +0000 UTC m=+1356.669688088" watchObservedRunningTime="2026-02-25 07:40:03.327444761 +0000 UTC m=+1356.689270821" Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.347402 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd271b1-f5b0-432d-bd41-f53c28744b6e" path="/var/lib/kubelet/pods/2cd271b1-f5b0-432d-bd41-f53c28744b6e/volumes" Feb 25 07:40:03 crc kubenswrapper[4749]: I0225 07:40:03.384900 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.384868518 podStartE2EDuration="2.384868518s" podCreationTimestamp="2026-02-25 07:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:40:03.368573584 +0000 UTC m=+1356.730399644" watchObservedRunningTime="2026-02-25 07:40:03.384868518 +0000 UTC m=+1356.746694578" Feb 25 07:40:04 crc kubenswrapper[4749]: I0225 07:40:04.745032 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533420-ktflf" Feb 25 07:40:04 crc kubenswrapper[4749]: I0225 07:40:04.863936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 25 07:40:04 crc kubenswrapper[4749]: I0225 07:40:04.893280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dbbd\" (UniqueName: \"kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd\") pod \"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0\" (UID: \"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0\") " Feb 25 07:40:04 crc kubenswrapper[4749]: I0225 07:40:04.898063 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd" (OuterVolumeSpecName: "kube-api-access-9dbbd") pod "bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" (UID: "bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0"). InnerVolumeSpecName "kube-api-access-9dbbd". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:40:04 crc kubenswrapper[4749]: I0225 07:40:04.996526 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dbbd\" (UniqueName: \"kubernetes.io/projected/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0-kube-api-access-9dbbd\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:05 crc kubenswrapper[4749]: I0225 07:40:05.312372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533420-ktflf" event={"ID":"bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0","Type":"ContainerDied","Data":"c28fdd8d7a3b00bf801cd8d4847d3b3c4cd92a1cd82efb312028f8afd4c7b9be"}
Feb 25 07:40:05 crc kubenswrapper[4749]: I0225 07:40:05.312416 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c28fdd8d7a3b00bf801cd8d4847d3b3c4cd92a1cd82efb312028f8afd4c7b9be"
Feb 25 07:40:05 crc kubenswrapper[4749]: I0225 07:40:05.312460 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533420-ktflf"
Feb 25 07:40:05 crc kubenswrapper[4749]: I0225 07:40:05.847403 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533414-x578f"]
Feb 25 07:40:05 crc kubenswrapper[4749]: I0225 07:40:05.860122 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533414-x578f"]
Feb 25 07:40:06 crc kubenswrapper[4749]: I0225 07:40:06.695793 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 25 07:40:06 crc kubenswrapper[4749]: I0225 07:40:06.695879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 25 07:40:07 crc kubenswrapper[4749]: I0225 07:40:07.343272 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3089009-af37-4a9f-b86c-64282651c575" path="/var/lib/kubelet/pods/d3089009-af37-4a9f-b86c-64282651c575/volumes"
Feb 25 07:40:09 crc kubenswrapper[4749]: I0225 07:40:09.864049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 25 07:40:09 crc kubenswrapper[4749]: I0225 07:40:09.909499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 25 07:40:10 crc kubenswrapper[4749]: I0225 07:40:10.423692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 25 07:40:11 crc kubenswrapper[4749]: I0225 07:40:11.430301 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 25 07:40:11 crc kubenswrapper[4749]: I0225 07:40:11.674527 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 25 07:40:11 crc kubenswrapper[4749]: I0225 07:40:11.674566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 25 07:40:11 crc kubenswrapper[4749]: I0225 07:40:11.696534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 25 07:40:11 crc kubenswrapper[4749]: I0225 07:40:11.697281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 25 07:40:12 crc kubenswrapper[4749]: I0225 07:40:12.690751 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea427259-a5cd-455d-a3c3-7031a607e42c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:40:12 crc kubenswrapper[4749]: I0225 07:40:12.690751 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ea427259-a5cd-455d-a3c3-7031a607e42c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:40:12 crc kubenswrapper[4749]: I0225 07:40:12.712842 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b8b4782e-5a90-4774-819b-dc12f4c1b585" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:40:12 crc kubenswrapper[4749]: I0225 07:40:12.712865 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b8b4782e-5a90-4774-819b-dc12f4c1b585" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.671989 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.672738 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.682696 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.683171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.686309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.689151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.702148 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.707378 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 25 07:40:21 crc kubenswrapper[4749]: I0225 07:40:21.719717 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 25 07:40:22 crc kubenswrapper[4749]: I0225 07:40:22.517557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 25 07:40:22 crc kubenswrapper[4749]: I0225 07:40:22.525994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 25 07:40:22 crc kubenswrapper[4749]: I0225 07:40:22.528877 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 25 07:40:30 crc kubenswrapper[4749]: I0225 07:40:30.693260 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 25 07:40:31 crc kubenswrapper[4749]: I0225 07:40:31.893184 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 25 07:40:34 crc kubenswrapper[4749]: I0225 07:40:34.488724 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="rabbitmq" containerID="cri-o://d218f78e714e77fdd26f9bd4efb44bcccf948b69774525348e530df7b0a6967d" gracePeriod=604797
Feb 25 07:40:35 crc kubenswrapper[4749]: I0225 07:40:35.907842 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="rabbitmq" containerID="cri-o://398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df" gracePeriod=604796
Feb 25 07:40:40 crc kubenswrapper[4749]: I0225 07:40:40.731068 4749 generic.go:334] "Generic (PLEG): container finished" podID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerID="d218f78e714e77fdd26f9bd4efb44bcccf948b69774525348e530df7b0a6967d" exitCode=0
Feb 25 07:40:40 crc kubenswrapper[4749]: I0225 07:40:40.731225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerDied","Data":"d218f78e714e77fdd26f9bd4efb44bcccf948b69774525348e530df7b0a6967d"}
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.102362 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.197213 4749 scope.go:117] "RemoveContainer" containerID="0ee6f92d7b6d3121615e47838af3ebed4597f6f31a2aa6b41501cbcd9488b75e"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfgxp\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271843 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.271970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.272000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.272024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.272070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.272127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.272162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret\") pod \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\" (UID: \"78a71a5d-2c51-4fd9-b1bf-f94393a3430e\") "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.273980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.274170 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.274180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.277727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.277765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.278823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp" (OuterVolumeSpecName: "kube-api-access-gfgxp") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "kube-api-access-gfgxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.281146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info" (OuterVolumeSpecName: "pod-info") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.289910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.339391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data" (OuterVolumeSpecName: "config-data") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.370369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf" (OuterVolumeSpecName: "server-conf") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374253 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374283 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374313 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374323 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-server-conf\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374332 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374341 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfgxp\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-kube-api-access-gfgxp\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374349 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-config-data\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374360 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374370 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-pod-info\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.374379 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.416684 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.417107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "78a71a5d-2c51-4fd9-b1bf-f94393a3430e" (UID: "78a71a5d-2c51-4fd9-b1bf-f94393a3430e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.476193 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/78a71a5d-2c51-4fd9-b1bf-f94393a3430e-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.476222 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.813205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"78a71a5d-2c51-4fd9-b1bf-f94393a3430e","Type":"ContainerDied","Data":"acd0ea76057e6bb9b468d33b843f8e8701fd73e07169e1c88cf7fb42daf8b3c9"}
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.813525 4749 scope.go:117] "RemoveContainer" containerID="d218f78e714e77fdd26f9bd4efb44bcccf948b69774525348e530df7b0a6967d"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.813267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.885272 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.889716 4749 scope.go:117] "RemoveContainer" containerID="e913d3a116a8d0df455db3e543a9b853ca9274c3337b06248d654bc6853a9e36"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.897425 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.909987 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 25 07:40:41 crc kubenswrapper[4749]: E0225 07:40:41.910315 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="rabbitmq"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.910330 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="rabbitmq"
Feb 25 07:40:41 crc kubenswrapper[4749]: E0225 07:40:41.910355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="setup-container"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.910361 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="setup-container"
Feb 25 07:40:41 crc kubenswrapper[4749]: E0225 07:40:41.910384 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" containerName="oc"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.910391 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" containerName="oc"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.910553 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" containerName="rabbitmq"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.910567 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" containerName="oc"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.911462 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.918501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.918877 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.918976 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.919099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.919169 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tqltk"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.919111 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.922786 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.947666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 25 07:40:41 crc kubenswrapper[4749]: I0225 07:40:41.950538 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.089775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.089911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.089956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.089988 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9dt5\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-kube-api-access-f9dt5\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-config-data\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.090662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9dt5\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-kube-api-access-f9dt5\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-config-data\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.192570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.193157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.193306 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.193395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.193674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-config-data\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.193957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.194854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.200526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.204643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.209036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.209062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9dt5\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-kube-api-access-f9dt5\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.230311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bddd8d20-d985-4c29-b82a-9bc75a6c40b9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.238906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"bddd8d20-d985-4c29-b82a-9bc75a6c40b9\") " pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.277305 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.430894 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.597799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") "
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.597901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") "
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.597954 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") "
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.598004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") "
Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.598559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "rabbitmq-erlang-cookie".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.598748 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.598825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.598963 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.599025 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.599110 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.599186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.599283 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5klmg\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg\") pod \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\" (UID: \"a499eea0-69d7-44b7-8839-dfdbfd9f872b\") " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.600077 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.600361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.600454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.603538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.625807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info" (OuterVolumeSpecName: "pod-info") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.625993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.629776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.632781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg" (OuterVolumeSpecName: "kube-api-access-5klmg") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "kube-api-access-5klmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.640133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data" (OuterVolumeSpecName: "config-data") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.665847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf" (OuterVolumeSpecName: "server-conf") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702294 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a499eea0-69d7-44b7-8839-dfdbfd9f872b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702326 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702337 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-server-conf\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702346 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5klmg\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-kube-api-access-5klmg\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702355 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a499eea0-69d7-44b7-8839-dfdbfd9f872b-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702363 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702372 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a499eea0-69d7-44b7-8839-dfdbfd9f872b-pod-info\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702401 
4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.702414 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.742577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a499eea0-69d7-44b7-8839-dfdbfd9f872b" (UID: "a499eea0-69d7-44b7-8839-dfdbfd9f872b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.758517 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.776120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.804393 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a499eea0-69d7-44b7-8839-dfdbfd9f872b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.804630 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.823735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"bddd8d20-d985-4c29-b82a-9bc75a6c40b9","Type":"ContainerStarted","Data":"301ffcaa227c013d0cef9bea0d76cbcf8ff0fe4f2cc58a92f9782db8e1906944"} Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.826182 4749 generic.go:334] "Generic (PLEG): container finished" podID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerID="398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df" exitCode=0 Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.826215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerDied","Data":"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df"} Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.826245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.826264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a499eea0-69d7-44b7-8839-dfdbfd9f872b","Type":"ContainerDied","Data":"d97b9555c9cace15b14eb9eda24cd7d0c285b8c79a14f23ab46313e31dd0fae0"} Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.826283 4749 scope.go:117] "RemoveContainer" containerID="398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.906258 4749 scope.go:117] "RemoveContainer" containerID="be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.940224 4749 scope.go:117] "RemoveContainer" containerID="398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df" Feb 25 07:40:42 crc kubenswrapper[4749]: E0225 07:40:42.941092 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df\": container with ID starting with 398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df not found: ID does not exist" containerID="398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.941141 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df"} err="failed to get container status \"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df\": rpc error: code = NotFound desc = could not find container \"398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df\": container with ID starting with 398fa803b706c8a10181b4f1b5b2379b3a2ea469d84fa31165f0a9da60b837df not found: ID does not exist" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.941173 4749 scope.go:117] "RemoveContainer" containerID="be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c" Feb 25 07:40:42 crc kubenswrapper[4749]: E0225 07:40:42.941588 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c\": container with ID starting with be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c not found: ID does not exist" containerID="be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.941635 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c"} err="failed to get container status \"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c\": rpc error: code = NotFound desc = could not find container \"be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c\": container with ID 
starting with be2337c97328fc284d1f695c66d2eca637d9aa6dcda727c9d2ba93edda04322c not found: ID does not exist" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.955610 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.967026 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.994034 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:40:42 crc kubenswrapper[4749]: E0225 07:40:42.994818 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="rabbitmq" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.994911 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="rabbitmq" Feb 25 07:40:42 crc kubenswrapper[4749]: E0225 07:40:42.995186 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="setup-container" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.995271 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="setup-container" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.995579 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" containerName="rabbitmq" Feb 25 07:40:42 crc kubenswrapper[4749]: I0225 07:40:42.997188 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.013965 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014170 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014704 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014579 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014880 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.014918 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jxcd4" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.018298 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116622 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.116942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.117027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kzl9\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-kube-api-access-5kzl9\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.117106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.218949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kzl9\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-kube-api-access-5kzl9\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219576 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.219667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.220252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.220835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.221513 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.225528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.228043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.228200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.239159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.248249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kzl9\" (UniqueName: 
\"kubernetes.io/projected/21dc0c76-2b5c-43cd-93e0-9f85eb8b102d-kube-api-access-5kzl9\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.251278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d\") " pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.334852 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a71a5d-2c51-4fd9-b1bf-f94393a3430e" path="/var/lib/kubelet/pods/78a71a5d-2c51-4fd9-b1bf-f94393a3430e/volumes" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.336378 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a499eea0-69d7-44b7-8839-dfdbfd9f872b" path="/var/lib/kubelet/pods/a499eea0-69d7-44b7-8839-dfdbfd9f872b/volumes" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.337947 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.800142 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 25 07:40:43 crc kubenswrapper[4749]: W0225 07:40:43.804834 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21dc0c76_2b5c_43cd_93e0_9f85eb8b102d.slice/crio-12f279fc843859e60b2df6d30beb72e5e8ca59a468f7adecb0c944b365da7416 WatchSource:0}: Error finding container 12f279fc843859e60b2df6d30beb72e5e8ca59a468f7adecb0c944b365da7416: Status 404 returned error can't find the container with id 12f279fc843859e60b2df6d30beb72e5e8ca59a468f7adecb0c944b365da7416 Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.841235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d","Type":"ContainerStarted","Data":"12f279fc843859e60b2df6d30beb72e5e8ca59a468f7adecb0c944b365da7416"} Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.887239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.888735 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.900320 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 25 07:40:43 crc kubenswrapper[4749]: I0225 07:40:43.923660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039253 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqls\" (UniqueName: \"kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " 
pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.039549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142570 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " 
pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rqls\" (UniqueName: \"kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142742 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.142925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" 
Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.144227 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.144803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.145491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.146452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.147228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.147499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.175846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqls\" (UniqueName: \"kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls\") pod \"dnsmasq-dns-d558885bc-lghcx\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.460960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.851974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bddd8d20-d985-4c29-b82a-9bc75a6c40b9","Type":"ContainerStarted","Data":"f919879f36de3bf5b0ca69e8f9f18a3d766358dd0c276f7c8241aa84b28c3ea0"} Feb 25 07:40:44 crc kubenswrapper[4749]: I0225 07:40:44.961696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:40:44 crc kubenswrapper[4749]: W0225 07:40:44.964535 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae3ee7c6_6b05_4592_9974_2f3ac3926b9c.slice/crio-30b4c929e5ab1ca320f87bc61de346ea328c0d2f99adb9191a5d7a060b2e924f WatchSource:0}: Error finding container 30b4c929e5ab1ca320f87bc61de346ea328c0d2f99adb9191a5d7a060b2e924f: Status 404 returned error can't find the container with id 30b4c929e5ab1ca320f87bc61de346ea328c0d2f99adb9191a5d7a060b2e924f Feb 25 07:40:45 crc kubenswrapper[4749]: I0225 07:40:45.863200 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerID="d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb" exitCode=0 Feb 25 07:40:45 crc kubenswrapper[4749]: I0225 07:40:45.863299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-lghcx" event={"ID":"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c","Type":"ContainerDied","Data":"d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb"} Feb 25 07:40:45 crc kubenswrapper[4749]: I0225 07:40:45.865145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-lghcx" event={"ID":"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c","Type":"ContainerStarted","Data":"30b4c929e5ab1ca320f87bc61de346ea328c0d2f99adb9191a5d7a060b2e924f"} Feb 25 07:40:45 crc kubenswrapper[4749]: I0225 07:40:45.890680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d","Type":"ContainerStarted","Data":"407bc553aea84fdce00d6971157bcc88c90cd071bc175ec59301f34f5f8c18c9"} Feb 25 07:40:46 crc kubenswrapper[4749]: I0225 07:40:46.906642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-lghcx" event={"ID":"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c","Type":"ContainerStarted","Data":"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8"} Feb 25 07:40:46 crc kubenswrapper[4749]: I0225 07:40:46.930377 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-lghcx" podStartSLOduration=3.930328056 podStartE2EDuration="3.930328056s" podCreationTimestamp="2026-02-25 07:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:40:46.92717363 +0000 UTC m=+1400.288999700" watchObservedRunningTime="2026-02-25 07:40:46.930328056 +0000 UTC m=+1400.292154116" Feb 25 07:40:47 crc kubenswrapper[4749]: I0225 
07:40:47.919740 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:51 crc kubenswrapper[4749]: I0225 07:40:51.671449 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:40:51 crc kubenswrapper[4749]: I0225 07:40:51.671842 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.462924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.552568 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.553015 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="dnsmasq-dns" containerID="cri-o://3062fe4ac8bc383f5e95dfc646935dd4a31508dceb833a33cad1010a456cc631" gracePeriod=10 Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.692960 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-lblvn"] Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.696430 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.716228 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-lblvn"] Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794361 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794404 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54fjk\" (UniqueName: \"kubernetes.io/projected/c18a0e33-8fc2-4653-8a90-39b356b71af2-kube-api-access-54fjk\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-config\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.794465 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896539 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54fjk\" (UniqueName: \"kubernetes.io/projected/c18a0e33-8fc2-4653-8a90-39b356b71af2-kube-api-access-54fjk\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-config\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.896665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.897492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.898008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.898506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.899134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.899622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.900789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18a0e33-8fc2-4653-8a90-39b356b71af2-config\") pod 
\"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:54 crc kubenswrapper[4749]: I0225 07:40:54.924056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54fjk\" (UniqueName: \"kubernetes.io/projected/c18a0e33-8fc2-4653-8a90-39b356b71af2-kube-api-access-54fjk\") pod \"dnsmasq-dns-78c64bc9c5-lblvn\" (UID: \"c18a0e33-8fc2-4653-8a90-39b356b71af2\") " pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.040259 4749 generic.go:334] "Generic (PLEG): container finished" podID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerID="3062fe4ac8bc383f5e95dfc646935dd4a31508dceb833a33cad1010a456cc631" exitCode=0 Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.040405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" event={"ID":"52a97588-ac3f-4ee4-8f8b-43f305bf0392","Type":"ContainerDied","Data":"3062fe4ac8bc383f5e95dfc646935dd4a31508dceb833a33cad1010a456cc631"} Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.040546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" event={"ID":"52a97588-ac3f-4ee4-8f8b-43f305bf0392","Type":"ContainerDied","Data":"3f4c78d88205c0a635cb935f272d8aac547b7c3cdda561ecd820c3c5de690713"} Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.040567 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4c78d88205c0a635cb935f272d8aac547b7c3cdda561ecd820c3c5de690713" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.041550 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.049086 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.200809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.200854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.201724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nhl\" (UniqueName: \"kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.201847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.202023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.202085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config\") pod \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\" (UID: \"52a97588-ac3f-4ee4-8f8b-43f305bf0392\") " Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.206194 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl" (OuterVolumeSpecName: "kube-api-access-w9nhl") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "kube-api-access-w9nhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.249078 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.253126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.262259 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config" (OuterVolumeSpecName: "config") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.269460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.272931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "52a97588-ac3f-4ee4-8f8b-43f305bf0392" (UID: "52a97588-ac3f-4ee4-8f8b-43f305bf0392"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304470 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304511 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nhl\" (UniqueName: \"kubernetes.io/projected/52a97588-ac3f-4ee4-8f8b-43f305bf0392-kube-api-access-w9nhl\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304536 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304547 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.304558 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/52a97588-ac3f-4ee4-8f8b-43f305bf0392-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:40:55 crc kubenswrapper[4749]: I0225 07:40:55.531747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-lblvn"] Feb 25 07:40:55 crc kubenswrapper[4749]: W0225 07:40:55.536476 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc18a0e33_8fc2_4653_8a90_39b356b71af2.slice/crio-b478382751d777406ffa2ed790ba4427cfaf49fbe8682386b91b97b0cf6a477f WatchSource:0}: Error finding container b478382751d777406ffa2ed790ba4427cfaf49fbe8682386b91b97b0cf6a477f: Status 404 returned error can't find the container with id b478382751d777406ffa2ed790ba4427cfaf49fbe8682386b91b97b0cf6a477f Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.055351 4749 generic.go:334] "Generic (PLEG): container finished" podID="c18a0e33-8fc2-4653-8a90-39b356b71af2" containerID="a1bfbf31eab8ba1c65d4bb95250cea24f055d14d2da06fb260291568ba44cd9b" exitCode=0 Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.055558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" event={"ID":"c18a0e33-8fc2-4653-8a90-39b356b71af2","Type":"ContainerDied","Data":"a1bfbf31eab8ba1c65d4bb95250cea24f055d14d2da06fb260291568ba44cd9b"} Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.055773 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" event={"ID":"c18a0e33-8fc2-4653-8a90-39b356b71af2","Type":"ContainerStarted","Data":"b478382751d777406ffa2ed790ba4427cfaf49fbe8682386b91b97b0cf6a477f"} Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.055864 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-msfrz" Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.130350 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:40:56 crc kubenswrapper[4749]: I0225 07:40:56.141590 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-msfrz"] Feb 25 07:40:57 crc kubenswrapper[4749]: I0225 07:40:57.065253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" event={"ID":"c18a0e33-8fc2-4653-8a90-39b356b71af2","Type":"ContainerStarted","Data":"6929eb7f63e6eb16941918bf7ba1f9dee3ec85ad75ee5f7a67914cfe603e62b6"} Feb 25 07:40:57 crc kubenswrapper[4749]: I0225 07:40:57.065677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:40:57 crc kubenswrapper[4749]: I0225 07:40:57.095793 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" podStartSLOduration=3.095777084 podStartE2EDuration="3.095777084s" podCreationTimestamp="2026-02-25 07:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:40:57.089767788 +0000 UTC m=+1410.451593818" watchObservedRunningTime="2026-02-25 07:40:57.095777084 +0000 UTC m=+1410.457603104" Feb 25 07:40:57 crc kubenswrapper[4749]: I0225 07:40:57.350244 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" 
path="/var/lib/kubelet/pods/52a97588-ac3f-4ee4-8f8b-43f305bf0392/volumes" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.050904 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-lblvn" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.147869 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.148263 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-lghcx" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="dnsmasq-dns" containerID="cri-o://8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8" gracePeriod=10 Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.679399 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815685 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rqls\" (UniqueName: \"kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: 
\"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.815971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.816006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam\") pod \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\" (UID: \"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c\") " Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.830913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls" (OuterVolumeSpecName: "kube-api-access-8rqls") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "kube-api-access-8rqls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.863962 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.867787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.872247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.878637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.884294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.888534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config" (OuterVolumeSpecName: "config") pod "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" (UID: "ae3ee7c6-6b05-4592-9974-2f3ac3926b9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.917925 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.917957 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rqls\" (UniqueName: \"kubernetes.io/projected/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-kube-api-access-8rqls\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.917973 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-config\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.917984 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.917995 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.918005 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:05 crc kubenswrapper[4749]: I0225 07:41:05.918017 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.174367 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerID="8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8" exitCode=0 Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.174456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-lghcx" event={"ID":"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c","Type":"ContainerDied","Data":"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8"} Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.174499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-lghcx" event={"ID":"ae3ee7c6-6b05-4592-9974-2f3ac3926b9c","Type":"ContainerDied","Data":"30b4c929e5ab1ca320f87bc61de346ea328c0d2f99adb9191a5d7a060b2e924f"} Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.174528 4749 scope.go:117] "RemoveContainer" containerID="8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.174589 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-lghcx" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.229277 4749 scope.go:117] "RemoveContainer" containerID="d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.236046 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.245090 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-lghcx"] Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.261857 4749 scope.go:117] "RemoveContainer" containerID="8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8" Feb 25 07:41:06 crc kubenswrapper[4749]: E0225 07:41:06.266346 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8\": container with ID starting with 8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8 not found: ID does not exist" containerID="8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.266407 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8"} err="failed to get container status \"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8\": rpc error: code = NotFound desc = could not find container \"8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8\": container with ID starting with 8c2b8312cd0030ababb548a8d57e3eaa6a883d9eccf3bacbeec4d86688b910e8 not found: ID does not exist" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.266441 4749 scope.go:117] "RemoveContainer" containerID="d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb" Feb 25 
07:41:06 crc kubenswrapper[4749]: E0225 07:41:06.266986 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb\": container with ID starting with d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb not found: ID does not exist" containerID="d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb" Feb 25 07:41:06 crc kubenswrapper[4749]: I0225 07:41:06.267027 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb"} err="failed to get container status \"d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb\": rpc error: code = NotFound desc = could not find container \"d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb\": container with ID starting with d4f9ae19261e6ea46407639139e6e453d0736569288ff1e5346974e715097fcb not found: ID does not exist" Feb 25 07:41:07 crc kubenswrapper[4749]: I0225 07:41:07.337350 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" path="/var/lib/kubelet/pods/ae3ee7c6-6b05-4592-9974-2f3ac3926b9c/volumes" Feb 25 07:41:17 crc kubenswrapper[4749]: I0225 07:41:17.301281 4749 generic.go:334] "Generic (PLEG): container finished" podID="bddd8d20-d985-4c29-b82a-9bc75a6c40b9" containerID="f919879f36de3bf5b0ca69e8f9f18a3d766358dd0c276f7c8241aa84b28c3ea0" exitCode=0 Feb 25 07:41:17 crc kubenswrapper[4749]: I0225 07:41:17.301356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bddd8d20-d985-4c29-b82a-9bc75a6c40b9","Type":"ContainerDied","Data":"f919879f36de3bf5b0ca69e8f9f18a3d766358dd0c276f7c8241aa84b28c3ea0"} Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.963167 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf"] Feb 25 07:41:18 crc kubenswrapper[4749]: E0225 07:41:17.964014 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="dnsmasq-dns" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964031 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="dnsmasq-dns" Feb 25 07:41:18 crc kubenswrapper[4749]: E0225 07:41:17.964055 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="dnsmasq-dns" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964062 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="dnsmasq-dns" Feb 25 07:41:18 crc kubenswrapper[4749]: E0225 07:41:17.964086 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="init" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964094 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="init" Feb 25 07:41:18 crc kubenswrapper[4749]: E0225 07:41:17.964111 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="init" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="init" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964331 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3ee7c6-6b05-4592-9974-2f3ac3926b9c" containerName="dnsmasq-dns" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.964346 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52a97588-ac3f-4ee4-8f8b-43f305bf0392" containerName="dnsmasq-dns" 
Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.965140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.979040 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf"] Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.984252 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.984566 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.984905 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:17.985213 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.112379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.112623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vlpp\" (UniqueName: \"kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.112732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.112888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.214639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.214697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vlpp\" (UniqueName: \"kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.214723 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.214764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.220032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.220239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.221988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" 
(UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.231560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vlpp\" (UniqueName: \"kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.316754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bddd8d20-d985-4c29-b82a-9bc75a6c40b9","Type":"ContainerStarted","Data":"150c8b75d90f99ccd45e69747bedceac85f8b9355440f3a894d366c3ea03f146"} Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.317906 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.328127 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.352355 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.352335821 podStartE2EDuration="37.352335821s" podCreationTimestamp="2026-02-25 07:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:41:18.346853579 +0000 UTC m=+1431.708679639" watchObservedRunningTime="2026-02-25 07:41:18.352335821 +0000 UTC m=+1431.714161851" Feb 25 07:41:18 crc kubenswrapper[4749]: I0225 07:41:18.913552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf"] Feb 25 07:41:19 crc kubenswrapper[4749]: I0225 07:41:19.331548 4749 generic.go:334] "Generic (PLEG): container finished" podID="21dc0c76-2b5c-43cd-93e0-9f85eb8b102d" containerID="407bc553aea84fdce00d6971157bcc88c90cd071bc175ec59301f34f5f8c18c9" exitCode=0 Feb 25 07:41:19 crc kubenswrapper[4749]: I0225 07:41:19.335173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d","Type":"ContainerDied","Data":"407bc553aea84fdce00d6971157bcc88c90cd071bc175ec59301f34f5f8c18c9"} Feb 25 07:41:19 crc kubenswrapper[4749]: I0225 07:41:19.335234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" event={"ID":"911f3de9-9115-4be2-98f8-a1e26e35387a","Type":"ContainerStarted","Data":"7c393c52fa8e4ec8f53d273abf6adfebb68a1d8fcea5e3103e1f30fd99ea94ea"} Feb 25 07:41:20 crc kubenswrapper[4749]: I0225 07:41:20.347614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"21dc0c76-2b5c-43cd-93e0-9f85eb8b102d","Type":"ContainerStarted","Data":"c8562a5a3efc8861459eb04672abe02dd08dbe6f84f0a1b7c06186b1aafa520c"} Feb 25 07:41:20 crc kubenswrapper[4749]: I0225 07:41:20.349207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:41:20 crc kubenswrapper[4749]: I0225 07:41:20.376260 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.376244113 podStartE2EDuration="38.376244113s" podCreationTimestamp="2026-02-25 07:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 07:41:20.372364009 +0000 UTC m=+1433.734190029" watchObservedRunningTime="2026-02-25 07:41:20.376244113 +0000 UTC m=+1433.738070133" Feb 25 07:41:21 crc kubenswrapper[4749]: I0225 07:41:21.671767 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:41:21 crc kubenswrapper[4749]: I0225 07:41:21.672131 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:41:21 crc kubenswrapper[4749]: I0225 07:41:21.672307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:41:21 crc kubenswrapper[4749]: I0225 07:41:21.673254 4749 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:41:21 crc kubenswrapper[4749]: I0225 07:41:21.673324 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b" gracePeriod=600 Feb 25 07:41:22 crc kubenswrapper[4749]: I0225 07:41:22.370359 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b" exitCode=0 Feb 25 07:41:22 crc kubenswrapper[4749]: I0225 07:41:22.370422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b"} Feb 25 07:41:22 crc kubenswrapper[4749]: I0225 07:41:22.370459 4749 scope.go:117] "RemoveContainer" containerID="e34470cf493c5ea9c3db0b2876750682218052e1a5e119ddc2e211d74634d3e6" Feb 25 07:41:28 crc kubenswrapper[4749]: I0225 07:41:28.772512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:41:29 crc kubenswrapper[4749]: I0225 07:41:29.446456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" event={"ID":"911f3de9-9115-4be2-98f8-a1e26e35387a","Type":"ContainerStarted","Data":"bbf0958eb33de45e23df74dffdc106553e747739a9aab4632588f473f3eeee78"} Feb 25 07:41:29 crc 
kubenswrapper[4749]: I0225 07:41:29.463367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"} Feb 25 07:41:29 crc kubenswrapper[4749]: I0225 07:41:29.474726 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" podStartSLOduration=2.632649969 podStartE2EDuration="12.47470736s" podCreationTimestamp="2026-02-25 07:41:17 +0000 UTC" firstStartedPulling="2026-02-25 07:41:18.927856583 +0000 UTC m=+1432.289682603" lastFinishedPulling="2026-02-25 07:41:28.769913974 +0000 UTC m=+1442.131739994" observedRunningTime="2026-02-25 07:41:29.470877527 +0000 UTC m=+1442.832703557" watchObservedRunningTime="2026-02-25 07:41:29.47470736 +0000 UTC m=+1442.836533390" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.294837 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.299278 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.309015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.433585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.433846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72ml\" (UniqueName: \"kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.434023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.535506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72ml\" (UniqueName: \"kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.535626 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.535743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.536402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.536791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.558849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72ml\" (UniqueName: \"kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml\") pod \"redhat-operators-kgqjh\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:31 crc kubenswrapper[4749]: I0225 07:41:31.640886 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:32 crc kubenswrapper[4749]: I0225 07:41:32.118620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:32 crc kubenswrapper[4749]: W0225 07:41:32.124582 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60b97b31_0ac3_473d_8671_50886ecc3e68.slice/crio-28a4294ce653c69b46dbbff434d123f18396793ca4881412a6ebf2f1dbdad1c2 WatchSource:0}: Error finding container 28a4294ce653c69b46dbbff434d123f18396793ca4881412a6ebf2f1dbdad1c2: Status 404 returned error can't find the container with id 28a4294ce653c69b46dbbff434d123f18396793ca4881412a6ebf2f1dbdad1c2 Feb 25 07:41:32 crc kubenswrapper[4749]: I0225 07:41:32.280568 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 25 07:41:32 crc kubenswrapper[4749]: I0225 07:41:32.500432 4749 generic.go:334] "Generic (PLEG): container finished" podID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerID="522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1" exitCode=0 Feb 25 07:41:32 crc kubenswrapper[4749]: I0225 07:41:32.500664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerDied","Data":"522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1"} Feb 25 07:41:32 crc kubenswrapper[4749]: I0225 07:41:32.500791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerStarted","Data":"28a4294ce653c69b46dbbff434d123f18396793ca4881412a6ebf2f1dbdad1c2"} Feb 25 07:41:33 crc kubenswrapper[4749]: I0225 07:41:33.341803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 25 07:41:33 crc kubenswrapper[4749]: I0225 07:41:33.512366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerStarted","Data":"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48"} Feb 25 07:41:35 crc kubenswrapper[4749]: I0225 07:41:35.538839 4749 generic.go:334] "Generic (PLEG): container finished" podID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerID="a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48" exitCode=0 Feb 25 07:41:35 crc kubenswrapper[4749]: I0225 07:41:35.539375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerDied","Data":"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48"} Feb 25 07:41:35 crc kubenswrapper[4749]: I0225 07:41:35.542657 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:41:37 crc kubenswrapper[4749]: I0225 07:41:37.579467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerStarted","Data":"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0"} Feb 25 07:41:37 crc kubenswrapper[4749]: I0225 07:41:37.628282 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kgqjh" podStartSLOduration=2.461791665 podStartE2EDuration="6.628254096s" podCreationTimestamp="2026-02-25 07:41:31 +0000 UTC" firstStartedPulling="2026-02-25 07:41:32.502113948 +0000 UTC m=+1445.863939968" lastFinishedPulling="2026-02-25 07:41:36.668576369 +0000 UTC m=+1450.030402399" observedRunningTime="2026-02-25 07:41:37.604195424 +0000 UTC m=+1450.966021444" 
watchObservedRunningTime="2026-02-25 07:41:37.628254096 +0000 UTC m=+1450.990080156" Feb 25 07:41:40 crc kubenswrapper[4749]: I0225 07:41:40.636910 4749 generic.go:334] "Generic (PLEG): container finished" podID="911f3de9-9115-4be2-98f8-a1e26e35387a" containerID="bbf0958eb33de45e23df74dffdc106553e747739a9aab4632588f473f3eeee78" exitCode=0 Feb 25 07:41:40 crc kubenswrapper[4749]: I0225 07:41:40.637040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" event={"ID":"911f3de9-9115-4be2-98f8-a1e26e35387a","Type":"ContainerDied","Data":"bbf0958eb33de45e23df74dffdc106553e747739a9aab4632588f473f3eeee78"} Feb 25 07:41:41 crc kubenswrapper[4749]: I0225 07:41:41.642000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:41 crc kubenswrapper[4749]: I0225 07:41:41.643096 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.162552 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.261087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle\") pod \"911f3de9-9115-4be2-98f8-a1e26e35387a\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.261442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory\") pod \"911f3de9-9115-4be2-98f8-a1e26e35387a\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.261546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam\") pod \"911f3de9-9115-4be2-98f8-a1e26e35387a\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.261644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vlpp\" (UniqueName: \"kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp\") pod \"911f3de9-9115-4be2-98f8-a1e26e35387a\" (UID: \"911f3de9-9115-4be2-98f8-a1e26e35387a\") " Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.267642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp" (OuterVolumeSpecName: "kube-api-access-8vlpp") pod "911f3de9-9115-4be2-98f8-a1e26e35387a" (UID: "911f3de9-9115-4be2-98f8-a1e26e35387a"). InnerVolumeSpecName "kube-api-access-8vlpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.267922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "911f3de9-9115-4be2-98f8-a1e26e35387a" (UID: "911f3de9-9115-4be2-98f8-a1e26e35387a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.299781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "911f3de9-9115-4be2-98f8-a1e26e35387a" (UID: "911f3de9-9115-4be2-98f8-a1e26e35387a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.312025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory" (OuterVolumeSpecName: "inventory") pod "911f3de9-9115-4be2-98f8-a1e26e35387a" (UID: "911f3de9-9115-4be2-98f8-a1e26e35387a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.365477 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.365524 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.365537 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/911f3de9-9115-4be2-98f8-a1e26e35387a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.365550 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vlpp\" (UniqueName: \"kubernetes.io/projected/911f3de9-9115-4be2-98f8-a1e26e35387a-kube-api-access-8vlpp\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.679636 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.683197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf" event={"ID":"911f3de9-9115-4be2-98f8-a1e26e35387a","Type":"ContainerDied","Data":"7c393c52fa8e4ec8f53d273abf6adfebb68a1d8fcea5e3103e1f30fd99ea94ea"} Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.683249 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c393c52fa8e4ec8f53d273abf6adfebb68a1d8fcea5e3103e1f30fd99ea94ea" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.727162 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kgqjh" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="registry-server" probeResult="failure" output=< Feb 25 07:41:42 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:41:42 crc kubenswrapper[4749]: > Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.779400 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724"] Feb 25 07:41:42 crc kubenswrapper[4749]: E0225 07:41:42.779851 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911f3de9-9115-4be2-98f8-a1e26e35387a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.779870 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="911f3de9-9115-4be2-98f8-a1e26e35387a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.780048 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="911f3de9-9115-4be2-98f8-a1e26e35387a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 
07:41:42.780639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.784493 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.784685 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.784769 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.784875 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.787425 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724"] Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.875551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.875693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq24p\" (UniqueName: \"kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: 
I0225 07:41:42.875745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.977162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.977350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq24p\" (UniqueName: \"kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.977431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.983470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:42 crc kubenswrapper[4749]: I0225 07:41:42.984710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:43 crc kubenswrapper[4749]: I0225 07:41:43.000910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq24p\" (UniqueName: \"kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hx724\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:43 crc kubenswrapper[4749]: I0225 07:41:43.102903 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:43 crc kubenswrapper[4749]: I0225 07:41:43.756621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724"] Feb 25 07:41:44 crc kubenswrapper[4749]: I0225 07:41:44.712314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" event={"ID":"9653d4b7-617d-4ca2-a596-3f4ab7086b05","Type":"ContainerStarted","Data":"b33bc75a870a6df979c702429ffc12ff8e40ea5d562d2f5388bfa852a94219bd"} Feb 25 07:41:44 crc kubenswrapper[4749]: I0225 07:41:44.712827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" event={"ID":"9653d4b7-617d-4ca2-a596-3f4ab7086b05","Type":"ContainerStarted","Data":"511d3278adf1cf6803f4737714d9699e4b70d0886ef39c54f6e941037dbc5420"} Feb 25 07:41:44 crc kubenswrapper[4749]: I0225 07:41:44.743103 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" podStartSLOduration=2.246182534 podStartE2EDuration="2.743033954s" podCreationTimestamp="2026-02-25 07:41:42 +0000 UTC" firstStartedPulling="2026-02-25 07:41:43.758313262 +0000 UTC m=+1457.120139292" lastFinishedPulling="2026-02-25 07:41:44.255164662 +0000 UTC m=+1457.616990712" observedRunningTime="2026-02-25 07:41:44.728884712 +0000 UTC m=+1458.090710772" watchObservedRunningTime="2026-02-25 07:41:44.743033954 +0000 UTC m=+1458.104860004" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.504671 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.508696 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.519552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.663302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.663478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g76xn\" (UniqueName: \"kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.663710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.766429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g76xn\" (UniqueName: \"kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.766837 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.766969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.767323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.767343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.798568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g76xn\" (UniqueName: \"kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn\") pod \"certified-operators-cc62d\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:45 crc kubenswrapper[4749]: I0225 07:41:45.873041 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:46 crc kubenswrapper[4749]: I0225 07:41:46.164170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:46 crc kubenswrapper[4749]: I0225 07:41:46.736791 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerID="fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672" exitCode=0 Feb 25 07:41:46 crc kubenswrapper[4749]: I0225 07:41:46.737037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerDied","Data":"fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672"} Feb 25 07:41:46 crc kubenswrapper[4749]: I0225 07:41:46.737062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerStarted","Data":"97304c4325dff1a935797ffc3013864f36b1df39a1e29bc34b32dd4373859234"} Feb 25 07:41:47 crc kubenswrapper[4749]: I0225 07:41:47.749301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerStarted","Data":"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3"} Feb 25 07:41:47 crc kubenswrapper[4749]: I0225 07:41:47.752907 4749 generic.go:334] "Generic (PLEG): container finished" podID="9653d4b7-617d-4ca2-a596-3f4ab7086b05" containerID="b33bc75a870a6df979c702429ffc12ff8e40ea5d562d2f5388bfa852a94219bd" exitCode=0 Feb 25 07:41:47 crc kubenswrapper[4749]: I0225 07:41:47.752982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" 
event={"ID":"9653d4b7-617d-4ca2-a596-3f4ab7086b05","Type":"ContainerDied","Data":"b33bc75a870a6df979c702429ffc12ff8e40ea5d562d2f5388bfa852a94219bd"} Feb 25 07:41:48 crc kubenswrapper[4749]: I0225 07:41:48.763714 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerID="41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3" exitCode=0 Feb 25 07:41:48 crc kubenswrapper[4749]: I0225 07:41:48.763826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerDied","Data":"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3"} Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.224395 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.334089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory\") pod \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.334272 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq24p\" (UniqueName: \"kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p\") pod \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\" (UID: \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.334301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam\") pod \"9653d4b7-617d-4ca2-a596-3f4ab7086b05\" (UID: 
\"9653d4b7-617d-4ca2-a596-3f4ab7086b05\") " Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.339082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p" (OuterVolumeSpecName: "kube-api-access-rq24p") pod "9653d4b7-617d-4ca2-a596-3f4ab7086b05" (UID: "9653d4b7-617d-4ca2-a596-3f4ab7086b05"). InnerVolumeSpecName "kube-api-access-rq24p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.362055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9653d4b7-617d-4ca2-a596-3f4ab7086b05" (UID: "9653d4b7-617d-4ca2-a596-3f4ab7086b05"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.373647 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory" (OuterVolumeSpecName: "inventory") pod "9653d4b7-617d-4ca2-a596-3f4ab7086b05" (UID: "9653d4b7-617d-4ca2-a596-3f4ab7086b05"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.436284 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.436314 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq24p\" (UniqueName: \"kubernetes.io/projected/9653d4b7-617d-4ca2-a596-3f4ab7086b05-kube-api-access-rq24p\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.436325 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9653d4b7-617d-4ca2-a596-3f4ab7086b05-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.785263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" event={"ID":"9653d4b7-617d-4ca2-a596-3f4ab7086b05","Type":"ContainerDied","Data":"511d3278adf1cf6803f4737714d9699e4b70d0886ef39c54f6e941037dbc5420"} Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.785312 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511d3278adf1cf6803f4737714d9699e4b70d0886ef39c54f6e941037dbc5420" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.785288 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hx724" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.789637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerStarted","Data":"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd"} Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.868475 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq"] Feb 25 07:41:49 crc kubenswrapper[4749]: E0225 07:41:49.869180 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9653d4b7-617d-4ca2-a596-3f4ab7086b05" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.869202 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9653d4b7-617d-4ca2-a596-3f4ab7086b05" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.869384 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9653d4b7-617d-4ca2-a596-3f4ab7086b05" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.870037 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.883409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq"] Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.946501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.946634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.946759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.946862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6wbf\" (UniqueName: \"kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.947552 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.948069 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.948400 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:41:49 crc kubenswrapper[4749]: I0225 07:41:49.948957 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.048619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.048725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.048778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6wbf\" (UniqueName: 
\"kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.048847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.052918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.053027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.053897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.067276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6wbf\" (UniqueName: \"kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.257488 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.815564 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq"] Feb 25 07:41:50 crc kubenswrapper[4749]: W0225 07:41:50.819282 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42b06635_d369_4299_92f6_e912f4d811df.slice/crio-258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd WatchSource:0}: Error finding container 258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd: Status 404 returned error can't find the container with id 258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd Feb 25 07:41:50 crc kubenswrapper[4749]: I0225 07:41:50.822382 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cc62d" podStartSLOduration=3.213685816 podStartE2EDuration="5.822364262s" podCreationTimestamp="2026-02-25 07:41:45 +0000 UTC" firstStartedPulling="2026-02-25 07:41:46.739712627 +0000 UTC m=+1460.101538687" lastFinishedPulling="2026-02-25 07:41:49.348391113 +0000 UTC m=+1462.710217133" observedRunningTime="2026-02-25 07:41:50.822053775 +0000 UTC m=+1464.183879795" 
watchObservedRunningTime="2026-02-25 07:41:50.822364262 +0000 UTC m=+1464.184190282" Feb 25 07:41:51 crc kubenswrapper[4749]: I0225 07:41:51.690268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:51 crc kubenswrapper[4749]: I0225 07:41:51.751786 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:51 crc kubenswrapper[4749]: I0225 07:41:51.808781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" event={"ID":"42b06635-d369-4299-92f6-e912f4d811df","Type":"ContainerStarted","Data":"ad126335bb4abf1a90ec4ed16c209ac6d778fc5ce40b9da4c7382d586a575b0c"} Feb 25 07:41:51 crc kubenswrapper[4749]: I0225 07:41:51.808843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" event={"ID":"42b06635-d369-4299-92f6-e912f4d811df","Type":"ContainerStarted","Data":"258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd"} Feb 25 07:41:51 crc kubenswrapper[4749]: I0225 07:41:51.842061 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" podStartSLOduration=2.326176749 podStartE2EDuration="2.842031979s" podCreationTimestamp="2026-02-25 07:41:49 +0000 UTC" firstStartedPulling="2026-02-25 07:41:50.821518151 +0000 UTC m=+1464.183344171" lastFinishedPulling="2026-02-25 07:41:51.337373371 +0000 UTC m=+1464.699199401" observedRunningTime="2026-02-25 07:41:51.826071894 +0000 UTC m=+1465.187897944" watchObservedRunningTime="2026-02-25 07:41:51.842031979 +0000 UTC m=+1465.203858029" Feb 25 07:41:52 crc kubenswrapper[4749]: I0225 07:41:52.872450 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:52 crc kubenswrapper[4749]: 
I0225 07:41:52.873200 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kgqjh" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="registry-server" containerID="cri-o://088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0" gracePeriod=2 Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.446351 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.538900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content\") pod \"60b97b31-0ac3-473d-8671-50886ecc3e68\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.538979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72ml\" (UniqueName: \"kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml\") pod \"60b97b31-0ac3-473d-8671-50886ecc3e68\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.539149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities\") pod \"60b97b31-0ac3-473d-8671-50886ecc3e68\" (UID: \"60b97b31-0ac3-473d-8671-50886ecc3e68\") " Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.539708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities" (OuterVolumeSpecName: "utilities") pod "60b97b31-0ac3-473d-8671-50886ecc3e68" (UID: "60b97b31-0ac3-473d-8671-50886ecc3e68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.544764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml" (OuterVolumeSpecName: "kube-api-access-f72ml") pod "60b97b31-0ac3-473d-8671-50886ecc3e68" (UID: "60b97b31-0ac3-473d-8671-50886ecc3e68"). InnerVolumeSpecName "kube-api-access-f72ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.642153 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72ml\" (UniqueName: \"kubernetes.io/projected/60b97b31-0ac3-473d-8671-50886ecc3e68-kube-api-access-f72ml\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.642186 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.658370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60b97b31-0ac3-473d-8671-50886ecc3e68" (UID: "60b97b31-0ac3-473d-8671-50886ecc3e68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.743259 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b97b31-0ac3-473d-8671-50886ecc3e68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.834745 4749 generic.go:334] "Generic (PLEG): container finished" podID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerID="088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0" exitCode=0 Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.834794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerDied","Data":"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0"} Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.834839 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgqjh" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.834870 4749 scope.go:117] "RemoveContainer" containerID="088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.834855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgqjh" event={"ID":"60b97b31-0ac3-473d-8671-50886ecc3e68","Type":"ContainerDied","Data":"28a4294ce653c69b46dbbff434d123f18396793ca4881412a6ebf2f1dbdad1c2"} Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.864832 4749 scope.go:117] "RemoveContainer" containerID="a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.883312 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.894241 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kgqjh"] Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.909855 4749 scope.go:117] "RemoveContainer" containerID="522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.950964 4749 scope.go:117] "RemoveContainer" containerID="088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0" Feb 25 07:41:53 crc kubenswrapper[4749]: E0225 07:41:53.951552 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0\": container with ID starting with 088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0 not found: ID does not exist" containerID="088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.951581 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0"} err="failed to get container status \"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0\": rpc error: code = NotFound desc = could not find container \"088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0\": container with ID starting with 088e0d62a2fc616939e3c6390760efa5995bd27bb645f5c05abf6c1be5b35de0 not found: ID does not exist" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.951627 4749 scope.go:117] "RemoveContainer" containerID="a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48" Feb 25 07:41:53 crc kubenswrapper[4749]: E0225 07:41:53.952201 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48\": container with ID starting with a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48 not found: ID does not exist" containerID="a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.952340 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48"} err="failed to get container status \"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48\": rpc error: code = NotFound desc = could not find container \"a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48\": container with ID starting with a41378d60f06a242125604dfc52369f150f3b1919aca034b822b847fd7362c48 not found: ID does not exist" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.952436 4749 scope.go:117] "RemoveContainer" containerID="522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1" Feb 25 07:41:53 crc kubenswrapper[4749]: E0225 
07:41:53.952964 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1\": container with ID starting with 522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1 not found: ID does not exist" containerID="522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1" Feb 25 07:41:53 crc kubenswrapper[4749]: I0225 07:41:53.952989 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1"} err="failed to get container status \"522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1\": rpc error: code = NotFound desc = could not find container \"522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1\": container with ID starting with 522744ec6f1c92ad48b3ea70678ba03fd63a3b103dc5d3494bcf17c4ee8e8eb1 not found: ID does not exist" Feb 25 07:41:55 crc kubenswrapper[4749]: I0225 07:41:55.335656 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" path="/var/lib/kubelet/pods/60b97b31-0ac3-473d-8671-50886ecc3e68/volumes" Feb 25 07:41:55 crc kubenswrapper[4749]: I0225 07:41:55.873806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:55 crc kubenswrapper[4749]: I0225 07:41:55.873889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:55 crc kubenswrapper[4749]: I0225 07:41:55.962983 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:56 crc kubenswrapper[4749]: I0225 07:41:56.923488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:58 crc kubenswrapper[4749]: I0225 07:41:58.075576 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:58 crc kubenswrapper[4749]: I0225 07:41:58.894971 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cc62d" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="registry-server" containerID="cri-o://142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd" gracePeriod=2 Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.433889 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.571582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content\") pod \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.571738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g76xn\" (UniqueName: \"kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn\") pod \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.571780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities\") pod \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\" (UID: \"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9\") " Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.573233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities" (OuterVolumeSpecName: "utilities") pod "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" (UID: "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.584551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn" (OuterVolumeSpecName: "kube-api-access-g76xn") pod "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" (UID: "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9"). InnerVolumeSpecName "kube-api-access-g76xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.639501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" (UID: "ec402cbf-5378-4cf4-841a-4f8ed25a6bc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.676935 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.676977 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g76xn\" (UniqueName: \"kubernetes.io/projected/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-kube-api-access-g76xn\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.676991 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.909951 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerID="142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd" exitCode=0 Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.910005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerDied","Data":"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd"} Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.910037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cc62d" event={"ID":"ec402cbf-5378-4cf4-841a-4f8ed25a6bc9","Type":"ContainerDied","Data":"97304c4325dff1a935797ffc3013864f36b1df39a1e29bc34b32dd4373859234"} Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.910059 4749 scope.go:117] "RemoveContainer" containerID="142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 
07:41:59.910055 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cc62d" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.951026 4749 scope.go:117] "RemoveContainer" containerID="41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3" Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.958211 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.968457 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cc62d"] Feb 25 07:41:59 crc kubenswrapper[4749]: I0225 07:41:59.979130 4749 scope.go:117] "RemoveContainer" containerID="fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.028070 4749 scope.go:117] "RemoveContainer" containerID="142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.028522 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd\": container with ID starting with 142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd not found: ID does not exist" containerID="142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.028556 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd"} err="failed to get container status \"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd\": rpc error: code = NotFound desc = could not find container \"142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd\": container with ID starting with 
142f05610ef4d887b57b989a7836989161bc7ecc688c3ea6b50889ef6cbc89cd not found: ID does not exist" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.028583 4749 scope.go:117] "RemoveContainer" containerID="41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.028937 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3\": container with ID starting with 41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3 not found: ID does not exist" containerID="41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.028963 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3"} err="failed to get container status \"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3\": rpc error: code = NotFound desc = could not find container \"41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3\": container with ID starting with 41de6357f32272e5f1733fac39b512289850a954767a9acb1a1e37d1287aa7b3 not found: ID does not exist" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.028980 4749 scope.go:117] "RemoveContainer" containerID="fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.029241 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672\": container with ID starting with fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672 not found: ID does not exist" containerID="fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672" Feb 25 07:42:00 crc 
kubenswrapper[4749]: I0225 07:42:00.029265 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672"} err="failed to get container status \"fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672\": rpc error: code = NotFound desc = could not find container \"fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672\": container with ID starting with fd58c43fdf3bbce7493cc0211ac5e4a89c5285e2d79b512797e8334b464c5672 not found: ID does not exist" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.151175 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533422-qqsch"] Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.151903 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="extract-content" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.151942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="extract-content" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.151965 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="extract-utilities" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.151978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="extract-utilities" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.152031 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="extract-utilities" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="extract-utilities" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 
07:42:00.152071 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152083 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.152115 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152127 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: E0225 07:42:00.152153 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="extract-content" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="extract-content" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152474 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.152501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b97b31-0ac3-473d-8671-50886ecc3e68" containerName="registry-server" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.153474 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.156787 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.157374 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.157973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.165206 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533422-qqsch"] Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.288441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66n5t\" (UniqueName: \"kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t\") pod \"auto-csr-approver-29533422-qqsch\" (UID: \"1ea2508a-7716-4440-b6c5-3acb308707ea\") " pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.391857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66n5t\" (UniqueName: \"kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t\") pod \"auto-csr-approver-29533422-qqsch\" (UID: \"1ea2508a-7716-4440-b6c5-3acb308707ea\") " pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.420776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66n5t\" (UniqueName: \"kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t\") pod \"auto-csr-approver-29533422-qqsch\" (UID: \"1ea2508a-7716-4440-b6c5-3acb308707ea\") " 
pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.480206 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:00 crc kubenswrapper[4749]: I0225 07:42:00.963805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533422-qqsch"] Feb 25 07:42:00 crc kubenswrapper[4749]: W0225 07:42:00.983348 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ea2508a_7716_4440_b6c5_3acb308707ea.slice/crio-0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c WatchSource:0}: Error finding container 0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c: Status 404 returned error can't find the container with id 0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c Feb 25 07:42:01 crc kubenswrapper[4749]: I0225 07:42:01.345120 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec402cbf-5378-4cf4-841a-4f8ed25a6bc9" path="/var/lib/kubelet/pods/ec402cbf-5378-4cf4-841a-4f8ed25a6bc9/volumes" Feb 25 07:42:01 crc kubenswrapper[4749]: I0225 07:42:01.941351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533422-qqsch" event={"ID":"1ea2508a-7716-4440-b6c5-3acb308707ea","Type":"ContainerStarted","Data":"0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c"} Feb 25 07:42:02 crc kubenswrapper[4749]: I0225 07:42:02.955824 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ea2508a-7716-4440-b6c5-3acb308707ea" containerID="78a27a57542f77edef9ca8e2f387c44669bd84acb03d5da544768c9a1da03655" exitCode=0 Feb 25 07:42:02 crc kubenswrapper[4749]: I0225 07:42:02.955989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533422-qqsch" 
event={"ID":"1ea2508a-7716-4440-b6c5-3acb308707ea","Type":"ContainerDied","Data":"78a27a57542f77edef9ca8e2f387c44669bd84acb03d5da544768c9a1da03655"} Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.332621 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.478891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66n5t\" (UniqueName: \"kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t\") pod \"1ea2508a-7716-4440-b6c5-3acb308707ea\" (UID: \"1ea2508a-7716-4440-b6c5-3acb308707ea\") " Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.502823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t" (OuterVolumeSpecName: "kube-api-access-66n5t") pod "1ea2508a-7716-4440-b6c5-3acb308707ea" (UID: "1ea2508a-7716-4440-b6c5-3acb308707ea"). InnerVolumeSpecName "kube-api-access-66n5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.582184 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66n5t\" (UniqueName: \"kubernetes.io/projected/1ea2508a-7716-4440-b6c5-3acb308707ea-kube-api-access-66n5t\") on node \"crc\" DevicePath \"\"" Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.980457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533422-qqsch" event={"ID":"1ea2508a-7716-4440-b6c5-3acb308707ea","Type":"ContainerDied","Data":"0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c"} Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.980515 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7222dff13c22453bb8c458be3de3bf9a627a975bf4de4488d6aa4b13c3205c" Feb 25 07:42:04 crc kubenswrapper[4749]: I0225 07:42:04.980586 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533422-qqsch" Feb 25 07:42:05 crc kubenswrapper[4749]: I0225 07:42:05.455551 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533416-7mtkv"] Feb 25 07:42:05 crc kubenswrapper[4749]: I0225 07:42:05.464990 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533416-7mtkv"] Feb 25 07:42:07 crc kubenswrapper[4749]: I0225 07:42:07.341723 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f" path="/var/lib/kubelet/pods/4b16ca14-bc0d-48c3-aa38-0af73eb7ca6f/volumes" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.474352 4749 scope.go:117] "RemoveContainer" containerID="ddb7923ed52544226de56285cb4e6acb84e010d2f881d8f2d34079d76daf5302" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.511975 4749 scope.go:117] "RemoveContainer" 
containerID="f813cb8cf0c5a0e1efe5e8d0320065344505244b9770bc5d44ccafb102e9d908" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.562205 4749 scope.go:117] "RemoveContainer" containerID="5f1371ad6e8cd5b13bf588e7e1ef5110215cf50a732ed6146ffd02bdd2fc4eec" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.614731 4749 scope.go:117] "RemoveContainer" containerID="9a1cf854c42749ee2e4aa4d1793464acb638c97f9dd760d8a76adcb767cda8b9" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.672536 4749 scope.go:117] "RemoveContainer" containerID="4bbc63636967ccf784c0e71258b47391e9620da4850f647fcdd366de88679cd1" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.737103 4749 scope.go:117] "RemoveContainer" containerID="14150f4048543b1bea80e7f17add36062f65116b769b582813183a83c08b3432" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.759726 4749 scope.go:117] "RemoveContainer" containerID="d8dcb65447dffa4d38c02e1932c451659fb9b77ae7521f730abb90a55ab9a9cf" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.787906 4749 scope.go:117] "RemoveContainer" containerID="231c5baeb8aeacabac6c666088561386ea18543aef1702125aca038da0fdd3d8" Feb 25 07:42:41 crc kubenswrapper[4749]: I0225 07:42:41.810480 4749 scope.go:117] "RemoveContainer" containerID="233fe70f6ad5d3572d4afc1feaeef24255f05ce045d2cbabbf5c417bd040eddd" Feb 25 07:43:42 crc kubenswrapper[4749]: I0225 07:43:42.031449 4749 scope.go:117] "RemoveContainer" containerID="4d8d6ff5f005eda3d7bac28c9113d2df900871e30469621e74315136f4865155" Feb 25 07:43:42 crc kubenswrapper[4749]: I0225 07:43:42.071564 4749 scope.go:117] "RemoveContainer" containerID="daf1229746c30846ae1262c9edc3c5358455d6adaedb0192eca6cb9a11e5168e" Feb 25 07:43:42 crc kubenswrapper[4749]: I0225 07:43:42.125204 4749 scope.go:117] "RemoveContainer" containerID="269cf05610217219000c47b8623644aac4dfeb65efba19487d9ef0872efbf736" Feb 25 07:43:51 crc kubenswrapper[4749]: I0225 07:43:51.672453 4749 patch_prober.go:28] interesting 
pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:43:51 crc kubenswrapper[4749]: I0225 07:43:51.672990 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.162389 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533424-859g2"] Feb 25 07:44:00 crc kubenswrapper[4749]: E0225 07:44:00.163483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea2508a-7716-4440-b6c5-3acb308707ea" containerName="oc" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.163503 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea2508a-7716-4440-b6c5-3acb308707ea" containerName="oc" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.163760 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea2508a-7716-4440-b6c5-3acb308707ea" containerName="oc" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.164490 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.167221 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.167644 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.167740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.175178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533424-859g2"] Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.262227 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xn4\" (UniqueName: \"kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4\") pod \"auto-csr-approver-29533424-859g2\" (UID: \"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9\") " pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.364392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xn4\" (UniqueName: \"kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4\") pod \"auto-csr-approver-29533424-859g2\" (UID: \"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9\") " pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.386515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xn4\" (UniqueName: \"kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4\") pod \"auto-csr-approver-29533424-859g2\" (UID: \"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9\") " 
pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.485707 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:00 crc kubenswrapper[4749]: I0225 07:44:00.968696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533424-859g2"] Feb 25 07:44:01 crc kubenswrapper[4749]: I0225 07:44:01.400427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533424-859g2" event={"ID":"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9","Type":"ContainerStarted","Data":"a4f14b23c3b5fd65927bb1a42acfaa6f781d1ed30170a63ac985a50d3b627c9d"} Feb 25 07:44:02 crc kubenswrapper[4749]: I0225 07:44:02.410091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533424-859g2" event={"ID":"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9","Type":"ContainerStarted","Data":"2f193ce21ec70e7769bb6c86b17aa99fc42a718f5c1d1010076f471ea5987ebd"} Feb 25 07:44:02 crc kubenswrapper[4749]: I0225 07:44:02.431658 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533424-859g2" podStartSLOduration=1.536986809 podStartE2EDuration="2.43163067s" podCreationTimestamp="2026-02-25 07:44:00 +0000 UTC" firstStartedPulling="2026-02-25 07:44:00.974843386 +0000 UTC m=+1594.336669426" lastFinishedPulling="2026-02-25 07:44:01.869487257 +0000 UTC m=+1595.231313287" observedRunningTime="2026-02-25 07:44:02.422446858 +0000 UTC m=+1595.784272878" watchObservedRunningTime="2026-02-25 07:44:02.43163067 +0000 UTC m=+1595.793456700" Feb 25 07:44:03 crc kubenswrapper[4749]: I0225 07:44:03.419811 4749 generic.go:334] "Generic (PLEG): container finished" podID="7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" containerID="2f193ce21ec70e7769bb6c86b17aa99fc42a718f5c1d1010076f471ea5987ebd" exitCode=0 Feb 25 07:44:03 crc 
kubenswrapper[4749]: I0225 07:44:03.419864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533424-859g2" event={"ID":"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9","Type":"ContainerDied","Data":"2f193ce21ec70e7769bb6c86b17aa99fc42a718f5c1d1010076f471ea5987ebd"} Feb 25 07:44:04 crc kubenswrapper[4749]: I0225 07:44:04.790135 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:04 crc kubenswrapper[4749]: I0225 07:44:04.866254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7xn4\" (UniqueName: \"kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4\") pod \"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9\" (UID: \"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9\") " Feb 25 07:44:04 crc kubenswrapper[4749]: I0225 07:44:04.873034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4" (OuterVolumeSpecName: "kube-api-access-t7xn4") pod "7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" (UID: "7c4a095a-b2c0-4f5b-9c6f-36438f2570b9"). InnerVolumeSpecName "kube-api-access-t7xn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:44:04 crc kubenswrapper[4749]: I0225 07:44:04.968789 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7xn4\" (UniqueName: \"kubernetes.io/projected/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9-kube-api-access-t7xn4\") on node \"crc\" DevicePath \"\"" Feb 25 07:44:05 crc kubenswrapper[4749]: I0225 07:44:05.443325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533424-859g2" event={"ID":"7c4a095a-b2c0-4f5b-9c6f-36438f2570b9","Type":"ContainerDied","Data":"a4f14b23c3b5fd65927bb1a42acfaa6f781d1ed30170a63ac985a50d3b627c9d"} Feb 25 07:44:05 crc kubenswrapper[4749]: I0225 07:44:05.444021 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f14b23c3b5fd65927bb1a42acfaa6f781d1ed30170a63ac985a50d3b627c9d" Feb 25 07:44:05 crc kubenswrapper[4749]: I0225 07:44:05.443405 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533424-859g2" Feb 25 07:44:05 crc kubenswrapper[4749]: I0225 07:44:05.528458 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533418-qwhq2"] Feb 25 07:44:05 crc kubenswrapper[4749]: I0225 07:44:05.544857 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533418-qwhq2"] Feb 25 07:44:07 crc kubenswrapper[4749]: I0225 07:44:07.339499 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a0afce-d86a-422c-8a74-b2e2554837d8" path="/var/lib/kubelet/pods/90a0afce-d86a-422c-8a74-b2e2554837d8/volumes" Feb 25 07:44:21 crc kubenswrapper[4749]: I0225 07:44:21.673475 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 25 07:44:21 crc kubenswrapper[4749]: I0225 07:44:21.674377 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.834620 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"] Feb 25 07:44:28 crc kubenswrapper[4749]: E0225 07:44:28.837543 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" containerName="oc" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.837569 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" containerName="oc" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.837899 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" containerName="oc" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.839657 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.867216 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"] Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.959778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8zt5\" (UniqueName: \"kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.959871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:28 crc kubenswrapper[4749]: I0225 07:44:28.960062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.062786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.062898 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l8zt5\" (UniqueName: \"kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.062981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.063215 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.063644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.098783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8zt5\" (UniqueName: \"kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5\") pod \"redhat-marketplace-bnbwx\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.161645 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.689289 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"] Feb 25 07:44:29 crc kubenswrapper[4749]: I0225 07:44:29.728894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerStarted","Data":"07970082c187ef76092d508266b8ce3db7bc93cec15d464bc011203153537969"} Feb 25 07:44:30 crc kubenswrapper[4749]: I0225 07:44:30.743788 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerID="d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9" exitCode=0 Feb 25 07:44:30 crc kubenswrapper[4749]: I0225 07:44:30.744093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerDied","Data":"d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9"} Feb 25 07:44:32 crc kubenswrapper[4749]: I0225 07:44:32.766058 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerID="280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c" exitCode=0 Feb 25 07:44:32 crc kubenswrapper[4749]: I0225 07:44:32.766304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerDied","Data":"280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c"} Feb 25 07:44:33 crc kubenswrapper[4749]: I0225 07:44:33.777235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" 
event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerStarted","Data":"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"} Feb 25 07:44:33 crc kubenswrapper[4749]: I0225 07:44:33.813616 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnbwx" podStartSLOduration=3.227458745 podStartE2EDuration="5.813582972s" podCreationTimestamp="2026-02-25 07:44:28 +0000 UTC" firstStartedPulling="2026-02-25 07:44:30.746458607 +0000 UTC m=+1624.108284627" lastFinishedPulling="2026-02-25 07:44:33.332582824 +0000 UTC m=+1626.694408854" observedRunningTime="2026-02-25 07:44:33.807610316 +0000 UTC m=+1627.169436326" watchObservedRunningTime="2026-02-25 07:44:33.813582972 +0000 UTC m=+1627.175408992" Feb 25 07:44:39 crc kubenswrapper[4749]: I0225 07:44:39.162779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:39 crc kubenswrapper[4749]: I0225 07:44:39.163395 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:39 crc kubenswrapper[4749]: I0225 07:44:39.226546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:39 crc kubenswrapper[4749]: I0225 07:44:39.903429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:39 crc kubenswrapper[4749]: I0225 07:44:39.965566 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"] Feb 25 07:44:41 crc kubenswrapper[4749]: I0225 07:44:41.854036 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnbwx" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="registry-server" 
containerID="cri-o://43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d" gracePeriod=2 Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.345910 4749 scope.go:117] "RemoveContainer" containerID="74908daed5d334da53f77583af35385d537b66a1f7b29501a945650e80bc0195" Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.348013 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbwx" Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.468543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities\") pod \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.468665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content\") pod \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.468722 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8zt5\" (UniqueName: \"kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5\") pod \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\" (UID: \"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860\") " Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.469874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities" (OuterVolumeSpecName: "utilities") pod "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" (UID: "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.480920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5" (OuterVolumeSpecName: "kube-api-access-l8zt5") pod "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" (UID: "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860"). InnerVolumeSpecName "kube-api-access-l8zt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.493714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" (UID: "5b05b1f2-9cff-4d6d-b03e-3f175cdb0860"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.570259 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.570295 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.570310 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8zt5\" (UniqueName: \"kubernetes.io/projected/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860-kube-api-access-l8zt5\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.871579 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerID="43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d" exitCode=0
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.871690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerDied","Data":"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"}
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.871712 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnbwx"
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.871749 4749 scope.go:117] "RemoveContainer" containerID="43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.871731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnbwx" event={"ID":"5b05b1f2-9cff-4d6d-b03e-3f175cdb0860","Type":"ContainerDied","Data":"07970082c187ef76092d508266b8ce3db7bc93cec15d464bc011203153537969"}
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.911761 4749 scope.go:117] "RemoveContainer" containerID="280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c"
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.952618 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"]
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.965953 4749 scope.go:117] "RemoveContainer" containerID="d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9"
Feb 25 07:44:42 crc kubenswrapper[4749]: I0225 07:44:42.969578 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnbwx"]
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.042757 4749 scope.go:117] "RemoveContainer" containerID="43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"
Feb 25 07:44:43 crc kubenswrapper[4749]: E0225 07:44:43.043682 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d\": container with ID starting with 43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d not found: ID does not exist" containerID="43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.043751 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d"} err="failed to get container status \"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d\": rpc error: code = NotFound desc = could not find container \"43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d\": container with ID starting with 43e200e864ab21360300abe8d78161dc78c0e7322a27825b68079659035a872d not found: ID does not exist"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.043802 4749 scope.go:117] "RemoveContainer" containerID="280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c"
Feb 25 07:44:43 crc kubenswrapper[4749]: E0225 07:44:43.044395 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c\": container with ID starting with 280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c not found: ID does not exist" containerID="280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.044442 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c"} err="failed to get container status \"280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c\": rpc error: code = NotFound desc = could not find container \"280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c\": container with ID starting with 280361c94c16bb59a33a1c75b22595e23db78ee2c557ea1ad6e1c71f70c3de6c not found: ID does not exist"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.044473 4749 scope.go:117] "RemoveContainer" containerID="d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9"
Feb 25 07:44:43 crc kubenswrapper[4749]: E0225 07:44:43.045020 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9\": container with ID starting with d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9 not found: ID does not exist" containerID="d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.045070 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9"} err="failed to get container status \"d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9\": rpc error: code = NotFound desc = could not find container \"d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9\": container with ID starting with d2f6cefe67509387bdfd7abc377c89c28d4200660ab987430660482bf607b6a9 not found: ID does not exist"
Feb 25 07:44:43 crc kubenswrapper[4749]: I0225 07:44:43.337370 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" path="/var/lib/kubelet/pods/5b05b1f2-9cff-4d6d-b03e-3f175cdb0860/volumes"
Feb 25 07:44:46 crc kubenswrapper[4749]: I0225 07:44:46.922315 4749 generic.go:334] "Generic (PLEG): container finished" podID="42b06635-d369-4299-92f6-e912f4d811df" containerID="ad126335bb4abf1a90ec4ed16c209ac6d778fc5ce40b9da4c7382d586a575b0c" exitCode=0
Feb 25 07:44:46 crc kubenswrapper[4749]: I0225 07:44:46.922378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" event={"ID":"42b06635-d369-4299-92f6-e912f4d811df","Type":"ContainerDied","Data":"ad126335bb4abf1a90ec4ed16c209ac6d778fc5ce40b9da4c7382d586a575b0c"}
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.380901 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq"
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.506725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle\") pod \"42b06635-d369-4299-92f6-e912f4d811df\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") "
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.506794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam\") pod \"42b06635-d369-4299-92f6-e912f4d811df\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") "
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.506938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6wbf\" (UniqueName: \"kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf\") pod \"42b06635-d369-4299-92f6-e912f4d811df\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") "
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.506953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory\") pod \"42b06635-d369-4299-92f6-e912f4d811df\" (UID: \"42b06635-d369-4299-92f6-e912f4d811df\") "
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.515877 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "42b06635-d369-4299-92f6-e912f4d811df" (UID: "42b06635-d369-4299-92f6-e912f4d811df"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.516473 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf" (OuterVolumeSpecName: "kube-api-access-b6wbf") pod "42b06635-d369-4299-92f6-e912f4d811df" (UID: "42b06635-d369-4299-92f6-e912f4d811df"). InnerVolumeSpecName "kube-api-access-b6wbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.535558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "42b06635-d369-4299-92f6-e912f4d811df" (UID: "42b06635-d369-4299-92f6-e912f4d811df"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.542767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory" (OuterVolumeSpecName: "inventory") pod "42b06635-d369-4299-92f6-e912f4d811df" (UID: "42b06635-d369-4299-92f6-e912f4d811df"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.609967 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.610377 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.610400 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6wbf\" (UniqueName: \"kubernetes.io/projected/42b06635-d369-4299-92f6-e912f4d811df-kube-api-access-b6wbf\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.610418 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42b06635-d369-4299-92f6-e912f4d811df-inventory\") on node \"crc\" DevicePath \"\""
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.950008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq" event={"ID":"42b06635-d369-4299-92f6-e912f4d811df","Type":"ContainerDied","Data":"258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd"}
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.950064 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258ccc7b28eebedd3e9f6e34e939f84352ca7f9b7d379f84c52f0ad0203999cd"
Feb 25 07:44:48 crc kubenswrapper[4749]: I0225 07:44:48.950147 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.079220 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"]
Feb 25 07:44:49 crc kubenswrapper[4749]: E0225 07:44:49.079939 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b06635-d369-4299-92f6-e912f4d811df" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.079969 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b06635-d369-4299-92f6-e912f4d811df" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 25 07:44:49 crc kubenswrapper[4749]: E0225 07:44:49.079990 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="extract-content"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.080001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="extract-content"
Feb 25 07:44:49 crc kubenswrapper[4749]: E0225 07:44:49.080035 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="registry-server"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.080046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="registry-server"
Feb 25 07:44:49 crc kubenswrapper[4749]: E0225 07:44:49.080081 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="extract-utilities"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.080093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="extract-utilities"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.080389 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b06635-d369-4299-92f6-e912f4d811df" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.080440 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b05b1f2-9cff-4d6d-b03e-3f175cdb0860" containerName="registry-server"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.081392 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.091041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"]
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.119720 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.120013 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.120199 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.120426 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.130434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.130567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmkc\" (UniqueName: \"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.130641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.232163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.232268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrmkc\" (UniqueName: \"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.232293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.237617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.238352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.250486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrmkc\" (UniqueName: \"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:49 crc kubenswrapper[4749]: I0225 07:44:49.448087 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"
Feb 25 07:44:50 crc kubenswrapper[4749]: I0225 07:44:50.012565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc"]
Feb 25 07:44:50 crc kubenswrapper[4749]: I0225 07:44:50.974301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" event={"ID":"fd2d74c2-5270-4697-b4fb-47a5affbbf68","Type":"ContainerStarted","Data":"8fb899a7a42d5a0da77449ec7423b1b84516e556a39429db17d61a6a9679cb8b"}
Feb 25 07:44:50 crc kubenswrapper[4749]: I0225 07:44:50.974668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" event={"ID":"fd2d74c2-5270-4697-b4fb-47a5affbbf68","Type":"ContainerStarted","Data":"219b92c53f276cb8f41b3d140571e9f9c8111c718828edb811e144451e80bb36"}
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.003954 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" podStartSLOduration=1.481687303 podStartE2EDuration="2.003931179s" podCreationTimestamp="2026-02-25 07:44:49 +0000 UTC" firstStartedPulling="2026-02-25 07:44:50.015144736 +0000 UTC m=+1643.376970746" lastFinishedPulling="2026-02-25 07:44:50.537388582 +0000 UTC m=+1643.899214622" observedRunningTime="2026-02-25 07:44:50.996736924 +0000 UTC m=+1644.358562954" watchObservedRunningTime="2026-02-25 07:44:51.003931179 +0000 UTC m=+1644.365757209"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.671729 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.671781 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.671818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.672792 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.672904 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" gracePeriod=600
Feb 25 07:44:51 crc kubenswrapper[4749]: E0225 07:44:51.793840 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.988676 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" exitCode=0
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.988862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"}
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.988959 4749 scope.go:117] "RemoveContainer" containerID="befee2ed55dd01e1b8f48a8805056fd97568cc138fc34609da87cbd56a66bd8b"
Feb 25 07:44:51 crc kubenswrapper[4749]: I0225 07:44:51.989942 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"
Feb 25 07:44:51 crc kubenswrapper[4749]: E0225 07:44:51.990703 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.137353 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"]
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.139152 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.141687 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.142733 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.154918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"]
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.252197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.252477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxj2m\" (UniqueName: \"kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.252636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.354438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.354473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxj2m\" (UniqueName: \"kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.354530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.355346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.362622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.376200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxj2m\" (UniqueName: \"kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m\") pod \"collect-profiles-29533425-bw74r\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.476829 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:00 crc kubenswrapper[4749]: I0225 07:45:00.960955 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"]
Feb 25 07:45:01 crc kubenswrapper[4749]: I0225 07:45:01.115699 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r" event={"ID":"81f99f78-940b-4c5e-b3f8-5d43fdc94340","Type":"ContainerStarted","Data":"216db77e60c4fd0743b4ab33b1a95eabadd55c3318e174f88c68b5a5f5772baf"}
Feb 25 07:45:02 crc kubenswrapper[4749]: I0225 07:45:02.126243 4749 generic.go:334] "Generic (PLEG): container finished" podID="81f99f78-940b-4c5e-b3f8-5d43fdc94340" containerID="371a26222bf108fc5ec0f7373d3dc72b50284939bd04341520608005b5fd6743" exitCode=0
Feb 25 07:45:02 crc kubenswrapper[4749]: I0225 07:45:02.126325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r" event={"ID":"81f99f78-940b-4c5e-b3f8-5d43fdc94340","Type":"ContainerDied","Data":"371a26222bf108fc5ec0f7373d3dc72b50284939bd04341520608005b5fd6743"}
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.535294 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.643138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume\") pod \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") "
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.643234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume\") pod \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") "
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.643346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxj2m\" (UniqueName: \"kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m\") pod \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\" (UID: \"81f99f78-940b-4c5e-b3f8-5d43fdc94340\") "
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.644876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume" (OuterVolumeSpecName: "config-volume") pod "81f99f78-940b-4c5e-b3f8-5d43fdc94340" (UID: "81f99f78-940b-4c5e-b3f8-5d43fdc94340"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.650489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m" (OuterVolumeSpecName: "kube-api-access-zxj2m") pod "81f99f78-940b-4c5e-b3f8-5d43fdc94340" (UID: "81f99f78-940b-4c5e-b3f8-5d43fdc94340"). InnerVolumeSpecName "kube-api-access-zxj2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.650941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81f99f78-940b-4c5e-b3f8-5d43fdc94340" (UID: "81f99f78-940b-4c5e-b3f8-5d43fdc94340"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.745251 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81f99f78-940b-4c5e-b3f8-5d43fdc94340-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.745285 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81f99f78-940b-4c5e-b3f8-5d43fdc94340-config-volume\") on node \"crc\" DevicePath \"\""
Feb 25 07:45:03 crc kubenswrapper[4749]: I0225 07:45:03.745296 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxj2m\" (UniqueName: \"kubernetes.io/projected/81f99f78-940b-4c5e-b3f8-5d43fdc94340-kube-api-access-zxj2m\") on node \"crc\" DevicePath \"\""
Feb 25 07:45:04 crc kubenswrapper[4749]: I0225 07:45:04.152582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r" event={"ID":"81f99f78-940b-4c5e-b3f8-5d43fdc94340","Type":"ContainerDied","Data":"216db77e60c4fd0743b4ab33b1a95eabadd55c3318e174f88c68b5a5f5772baf"}
Feb 25 07:45:04 crc kubenswrapper[4749]: I0225 07:45:04.152720 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="216db77e60c4fd0743b4ab33b1a95eabadd55c3318e174f88c68b5a5f5772baf"
Feb 25 07:45:04 crc kubenswrapper[4749]: I0225 07:45:04.152672 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"
Feb 25 07:45:04 crc kubenswrapper[4749]: I0225 07:45:04.323582 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"
Feb 25 07:45:04 crc kubenswrapper[4749]: E0225 07:45:04.324089 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 07:45:19 crc kubenswrapper[4749]: I0225 07:45:19.323313 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28"
Feb 25 07:45:19 crc kubenswrapper[4749]: E0225 07:45:19.324545 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 07:45:33 crc kubenswrapper[4749]: I0225 07:45:33.323125 4749
scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:45:33 crc kubenswrapper[4749]: E0225 07:45:33.323941 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:45:42 crc kubenswrapper[4749]: I0225 07:45:42.447113 4749 scope.go:117] "RemoveContainer" containerID="73251f4d1d186f22486ad55e47907bae7d9f70049407db8c086b5b7e446dcd4f" Feb 25 07:45:42 crc kubenswrapper[4749]: I0225 07:45:42.492434 4749 scope.go:117] "RemoveContainer" containerID="cdf7247bc56f70baa999ff5bef0403ad8f1ae80ffe68f294f531558cd8b11eca" Feb 25 07:45:42 crc kubenswrapper[4749]: I0225 07:45:42.529854 4749 scope.go:117] "RemoveContainer" containerID="3062fe4ac8bc383f5e95dfc646935dd4a31508dceb833a33cad1010a456cc631" Feb 25 07:45:42 crc kubenswrapper[4749]: I0225 07:45:42.605947 4749 scope.go:117] "RemoveContainer" containerID="683ccc7b1eac6c80b6e934d381b28fea247e79059c913f484d4efca516045894" Feb 25 07:45:48 crc kubenswrapper[4749]: I0225 07:45:48.322363 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:45:48 crc kubenswrapper[4749]: E0225 07:45:48.323232 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 
07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.094082 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6259-account-create-update-5zpf8"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.101929 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mqghp"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.109258 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6259-account-create-update-5zpf8"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.117477 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5fnnk"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.124854 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mqghp"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.133811 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-39fa-account-create-update-2sll4"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.144124 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5fnnk"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.152554 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-39fa-account-create-update-2sll4"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.161030 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533426-5xgxf"] Feb 25 07:46:00 crc kubenswrapper[4749]: E0225 07:46:00.161624 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f99f78-940b-4c5e-b3f8-5d43fdc94340" containerName="collect-profiles" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.161649 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f99f78-940b-4c5e-b3f8-5d43fdc94340" containerName="collect-profiles" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.161911 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="81f99f78-940b-4c5e-b3f8-5d43fdc94340" containerName="collect-profiles" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.162842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.164770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.165713 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.166891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.184356 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533426-5xgxf"] Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.240365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt7mj\" (UniqueName: \"kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj\") pod \"auto-csr-approver-29533426-5xgxf\" (UID: \"a389a4a6-b001-424f-af2d-058cc85fc419\") " pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.323294 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:46:00 crc kubenswrapper[4749]: E0225 07:46:00.323795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.342500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt7mj\" (UniqueName: \"kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj\") pod \"auto-csr-approver-29533426-5xgxf\" (UID: \"a389a4a6-b001-424f-af2d-058cc85fc419\") " pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.371497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt7mj\" (UniqueName: \"kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj\") pod \"auto-csr-approver-29533426-5xgxf\" (UID: \"a389a4a6-b001-424f-af2d-058cc85fc419\") " pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.483440 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:00 crc kubenswrapper[4749]: I0225 07:46:00.958254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533426-5xgxf"] Feb 25 07:46:00 crc kubenswrapper[4749]: W0225 07:46:00.964260 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda389a4a6_b001_424f_af2d_058cc85fc419.slice/crio-0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac WatchSource:0}: Error finding container 0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac: Status 404 returned error can't find the container with id 0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac Feb 25 07:46:01 crc kubenswrapper[4749]: I0225 07:46:01.341324 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122084b5-9e7b-4874-86db-10ae68b0c801" path="/var/lib/kubelet/pods/122084b5-9e7b-4874-86db-10ae68b0c801/volumes" Feb 25 07:46:01 crc kubenswrapper[4749]: I0225 07:46:01.342478 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ec8fbf-7999-4ba2-a4a0-cf742f7317dc" path="/var/lib/kubelet/pods/29ec8fbf-7999-4ba2-a4a0-cf742f7317dc/volumes" Feb 25 07:46:01 crc kubenswrapper[4749]: I0225 07:46:01.343665 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d" path="/var/lib/kubelet/pods/3c748c65-3ae3-48d3-aacf-d4fd2dc4fd2d/volumes" Feb 25 07:46:01 crc kubenswrapper[4749]: I0225 07:46:01.344886 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625ec738-a35a-4a37-ab15-63334e614c88" path="/var/lib/kubelet/pods/625ec738-a35a-4a37-ab15-63334e614c88/volumes" Feb 25 07:46:01 crc kubenswrapper[4749]: I0225 07:46:01.828301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" 
event={"ID":"a389a4a6-b001-424f-af2d-058cc85fc419","Type":"ContainerStarted","Data":"0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac"} Feb 25 07:46:02 crc kubenswrapper[4749]: I0225 07:46:02.842526 4749 generic.go:334] "Generic (PLEG): container finished" podID="a389a4a6-b001-424f-af2d-058cc85fc419" containerID="5693fb493b6bc8d837acdc868c7dce21222add1de538a42d69aa0445c7ca46dc" exitCode=0 Feb 25 07:46:02 crc kubenswrapper[4749]: I0225 07:46:02.842656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" event={"ID":"a389a4a6-b001-424f-af2d-058cc85fc419","Type":"ContainerDied","Data":"5693fb493b6bc8d837acdc868c7dce21222add1de538a42d69aa0445c7ca46dc"} Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.063137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ca9f-account-create-update-5kb9h"] Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.072727 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-48962"] Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.083010 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ca9f-account-create-update-5kb9h"] Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.093852 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-48962"] Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.292122 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.427930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt7mj\" (UniqueName: \"kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj\") pod \"a389a4a6-b001-424f-af2d-058cc85fc419\" (UID: \"a389a4a6-b001-424f-af2d-058cc85fc419\") " Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.433766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj" (OuterVolumeSpecName: "kube-api-access-dt7mj") pod "a389a4a6-b001-424f-af2d-058cc85fc419" (UID: "a389a4a6-b001-424f-af2d-058cc85fc419"). InnerVolumeSpecName "kube-api-access-dt7mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.531050 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt7mj\" (UniqueName: \"kubernetes.io/projected/a389a4a6-b001-424f-af2d-058cc85fc419-kube-api-access-dt7mj\") on node \"crc\" DevicePath \"\"" Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.870973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" event={"ID":"a389a4a6-b001-424f-af2d-058cc85fc419","Type":"ContainerDied","Data":"0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac"} Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.871033 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5513e0be733497991f83e7ff88588fe6e75be020314e5bd5979d9d14ebd1ac" Feb 25 07:46:04 crc kubenswrapper[4749]: I0225 07:46:04.871131 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533426-5xgxf" Feb 25 07:46:05 crc kubenswrapper[4749]: I0225 07:46:05.337579 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb70cdb-8655-4c36-8c0c-8bde033488ad" path="/var/lib/kubelet/pods/cfb70cdb-8655-4c36-8c0c-8bde033488ad/volumes" Feb 25 07:46:05 crc kubenswrapper[4749]: I0225 07:46:05.338164 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de31f747-1a7d-4950-93b5-88938e84e33e" path="/var/lib/kubelet/pods/de31f747-1a7d-4950-93b5-88938e84e33e/volumes" Feb 25 07:46:05 crc kubenswrapper[4749]: I0225 07:46:05.360301 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533420-ktflf"] Feb 25 07:46:05 crc kubenswrapper[4749]: I0225 07:46:05.367866 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533420-ktflf"] Feb 25 07:46:07 crc kubenswrapper[4749]: I0225 07:46:07.340282 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0" path="/var/lib/kubelet/pods/bcdedcd9-c76c-4c9a-b6a3-917d36cc63c0/volumes" Feb 25 07:46:11 crc kubenswrapper[4749]: I0225 07:46:11.322974 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:46:11 crc kubenswrapper[4749]: E0225 07:46:11.323838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:46:16 crc kubenswrapper[4749]: I0225 07:46:16.002769 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="fd2d74c2-5270-4697-b4fb-47a5affbbf68" containerID="8fb899a7a42d5a0da77449ec7423b1b84516e556a39429db17d61a6a9679cb8b" exitCode=0 Feb 25 07:46:16 crc kubenswrapper[4749]: I0225 07:46:16.002879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" event={"ID":"fd2d74c2-5270-4697-b4fb-47a5affbbf68","Type":"ContainerDied","Data":"8fb899a7a42d5a0da77449ec7423b1b84516e556a39429db17d61a6a9679cb8b"} Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.493713 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.647158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam\") pod \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.647440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory\") pod \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.647520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrmkc\" (UniqueName: \"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc\") pod \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\" (UID: \"fd2d74c2-5270-4697-b4fb-47a5affbbf68\") " Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.656575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc" (OuterVolumeSpecName: "kube-api-access-wrmkc") pod "fd2d74c2-5270-4697-b4fb-47a5affbbf68" (UID: "fd2d74c2-5270-4697-b4fb-47a5affbbf68"). InnerVolumeSpecName "kube-api-access-wrmkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.698416 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fd2d74c2-5270-4697-b4fb-47a5affbbf68" (UID: "fd2d74c2-5270-4697-b4fb-47a5affbbf68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.700347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory" (OuterVolumeSpecName: "inventory") pod "fd2d74c2-5270-4697-b4fb-47a5affbbf68" (UID: "fd2d74c2-5270-4697-b4fb-47a5affbbf68"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.749958 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.750016 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrmkc\" (UniqueName: \"kubernetes.io/projected/fd2d74c2-5270-4697-b4fb-47a5affbbf68-kube-api-access-wrmkc\") on node \"crc\" DevicePath \"\"" Feb 25 07:46:17 crc kubenswrapper[4749]: I0225 07:46:17.750050 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd2d74c2-5270-4697-b4fb-47a5affbbf68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.028406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" event={"ID":"fd2d74c2-5270-4697-b4fb-47a5affbbf68","Type":"ContainerDied","Data":"219b92c53f276cb8f41b3d140571e9f9c8111c718828edb811e144451e80bb36"} Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.028465 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="219b92c53f276cb8f41b3d140571e9f9c8111c718828edb811e144451e80bb36" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.028566 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.169215 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng"] Feb 25 07:46:18 crc kubenswrapper[4749]: E0225 07:46:18.169799 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2d74c2-5270-4697-b4fb-47a5affbbf68" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.169821 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2d74c2-5270-4697-b4fb-47a5affbbf68" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 07:46:18 crc kubenswrapper[4749]: E0225 07:46:18.169840 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a389a4a6-b001-424f-af2d-058cc85fc419" containerName="oc" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.169851 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a389a4a6-b001-424f-af2d-058cc85fc419" containerName="oc" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.170088 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a389a4a6-b001-424f-af2d-058cc85fc419" containerName="oc" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.170122 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2d74c2-5270-4697-b4fb-47a5affbbf68" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.170922 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.173903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.174670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.175142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.175313 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.182274 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng"] Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.261517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bwm\" (UniqueName: \"kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.262002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: 
I0225 07:46:18.262100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.364101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.364203 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.364345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bwm\" (UniqueName: \"kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.367987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.370255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.379631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bwm\" (UniqueName: \"kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vfqng\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:18 crc kubenswrapper[4749]: I0225 07:46:18.536098 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:46:19 crc kubenswrapper[4749]: I0225 07:46:19.127329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng"] Feb 25 07:46:20 crc kubenswrapper[4749]: I0225 07:46:20.053929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" event={"ID":"f2f01883-686b-4aed-9458-ee14d1c3eb10","Type":"ContainerStarted","Data":"22157a196dad4ae1628f70ced2aae8c2cb5b01f28a2f3388977c6c351d72338a"} Feb 25 07:46:20 crc kubenswrapper[4749]: I0225 07:46:20.054017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" event={"ID":"f2f01883-686b-4aed-9458-ee14d1c3eb10","Type":"ContainerStarted","Data":"d9a49a925e7a810a3a948cec8ea72ad9324536faa96bc8baebdf0a7b588aeb25"} Feb 25 07:46:20 crc kubenswrapper[4749]: I0225 07:46:20.082317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" podStartSLOduration=1.54098713 podStartE2EDuration="2.082291189s" podCreationTimestamp="2026-02-25 07:46:18 +0000 UTC" firstStartedPulling="2026-02-25 07:46:19.137574194 +0000 UTC m=+1732.499400214" lastFinishedPulling="2026-02-25 07:46:19.678878213 +0000 UTC m=+1733.040704273" observedRunningTime="2026-02-25 07:46:20.078701453 +0000 UTC m=+1733.440527523" watchObservedRunningTime="2026-02-25 07:46:20.082291189 +0000 UTC m=+1733.444117209" Feb 25 07:46:23 crc kubenswrapper[4749]: I0225 07:46:23.323163 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:46:23 crc kubenswrapper[4749]: E0225 07:46:23.324366 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:46:29 crc kubenswrapper[4749]: I0225 07:46:29.068644 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6lvxb"] Feb 25 07:46:29 crc kubenswrapper[4749]: I0225 07:46:29.081401 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6lvxb"] Feb 25 07:46:29 crc kubenswrapper[4749]: I0225 07:46:29.343353 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0991404c-02d3-451e-9a9a-fbd93370e965" path="/var/lib/kubelet/pods/0991404c-02d3-451e-9a9a-fbd93370e965/volumes" Feb 25 07:46:38 crc kubenswrapper[4749]: I0225 07:46:38.322646 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:46:38 crc kubenswrapper[4749]: E0225 07:46:38.323847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.062556 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wwlbd"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.082246 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rdz5t"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.099833 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-634a-account-create-update-zm2md"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.111574 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rdz5t"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.122874 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wwlbd"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.133158 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5mn2f"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.143859 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-634a-account-create-update-zm2md"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.155755 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b75e-account-create-update-ln7b8"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.166263 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8318-account-create-update-jmlls"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.176126 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b75e-account-create-update-ln7b8"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.183689 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5mn2f"] Feb 25 07:46:40 crc kubenswrapper[4749]: I0225 07:46:40.191235 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8318-account-create-update-jmlls"] Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.341138 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29504d02-065f-4682-aaab-4b0057c0a3e5" path="/var/lib/kubelet/pods/29504d02-065f-4682-aaab-4b0057c0a3e5/volumes" Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.342546 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5421bf98-9a20-489d-95b6-e96102c963da" path="/var/lib/kubelet/pods/5421bf98-9a20-489d-95b6-e96102c963da/volumes" Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.343848 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10" path="/var/lib/kubelet/pods/852f5f2e-07e1-4fcf-8c5d-fb17ea3c8a10/volumes" Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.345200 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45bc1da-cff5-4bb0-884e-439a3cc64f38" path="/var/lib/kubelet/pods/a45bc1da-cff5-4bb0-884e-439a3cc64f38/volumes" Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.347658 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbb415a-14d1-48e4-8baf-de714957de54" path="/var/lib/kubelet/pods/adbb415a-14d1-48e4-8baf-de714957de54/volumes" Feb 25 07:46:41 crc kubenswrapper[4749]: I0225 07:46:41.348961 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a33493-4015-4e30-b004-3e3e501bef55" path="/var/lib/kubelet/pods/e6a33493-4015-4e30-b004-3e3e501bef55/volumes" Feb 25 07:46:42 crc kubenswrapper[4749]: I0225 07:46:42.707196 4749 scope.go:117] "RemoveContainer" containerID="eb455f7390988a1bcd4d5bbfe34f53f6489a85308d1f76b5ec915caf9b9e456a" Feb 25 07:46:42 crc kubenswrapper[4749]: I0225 07:46:42.768778 4749 scope.go:117] "RemoveContainer" containerID="43b66c1f3b7fd34bedaca82526c00987fa78506682b2ef846fb59b6fd6afd428" Feb 25 07:46:42 crc kubenswrapper[4749]: I0225 07:46:42.863908 4749 scope.go:117] "RemoveContainer" containerID="a9ad0337906e9bd2757ef085f0fe39f17c49743a19bd35f1e6f151c465337a64" Feb 25 07:46:42 crc kubenswrapper[4749]: I0225 07:46:42.907199 4749 scope.go:117] "RemoveContainer" containerID="81cef26eca6a22bcd0bfee9374f35ac974f3ff6768a118d779ddfb7a075e8875" Feb 25 07:46:42 crc kubenswrapper[4749]: I0225 07:46:42.968978 4749 scope.go:117] "RemoveContainer" 
containerID="5e07cf856dbe11b31d5c2fc33679ebeb80a608397f46b96ee63e5cb128881d09" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.028458 4749 scope.go:117] "RemoveContainer" containerID="877034bdebe9af498641ea1384a3d14c6780bbeba64f9119e1323a225b912fdd" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.035388 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-klc58"] Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.050666 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-klc58"] Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.059699 4749 scope.go:117] "RemoveContainer" containerID="93c2d0bdb0f83d87cdf0667f833c88bd248b781592b05af3b547451c87a78998" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.078653 4749 scope.go:117] "RemoveContainer" containerID="e6c0eee1f85c6edcdea8c9c6fd9ffc79f0186e1dcca9e7e603dbebcadb03e580" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.097532 4749 scope.go:117] "RemoveContainer" containerID="693e20096bb732c62ffb9d0b2bdfd8f9e6deee6e70a9b215dfa5163ac7abadae" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.133945 4749 scope.go:117] "RemoveContainer" containerID="98d3c7179d19382db3c6a0297499b7639ea64490e3ca51747ed87508c431a4d7" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.152821 4749 scope.go:117] "RemoveContainer" containerID="ec22a47aecb1976c825d82270f902e7da78446566dcbe2062f91a67cb37acfee" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.174296 4749 scope.go:117] "RemoveContainer" containerID="a3521c34d5ad33c1c4c8a5d9b95b563c4893620005cc23ac2b747d35d875aa71" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.196361 4749 scope.go:117] "RemoveContainer" containerID="3310a2a268d1fb19f153267b0dec4884f57c85e5ce309c5c1e88e823a73dc51a" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.239172 4749 scope.go:117] "RemoveContainer" 
containerID="9d0ad1a599539c7513fccb1cd55f646673985ea392c8409d1cd0dd7980e2b67d" Feb 25 07:46:43 crc kubenswrapper[4749]: I0225 07:46:43.340993 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe32b6c-b26a-4960-9b88-db584d3c56bf" path="/var/lib/kubelet/pods/bbe32b6c-b26a-4960-9b88-db584d3c56bf/volumes" Feb 25 07:46:45 crc kubenswrapper[4749]: I0225 07:46:45.048961 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rdvkc"] Feb 25 07:46:45 crc kubenswrapper[4749]: I0225 07:46:45.067906 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rdvkc"] Feb 25 07:46:45 crc kubenswrapper[4749]: I0225 07:46:45.341292 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f34460-f59e-4a71-82f0-4486398bd903" path="/var/lib/kubelet/pods/e3f34460-f59e-4a71-82f0-4486398bd903/volumes" Feb 25 07:46:52 crc kubenswrapper[4749]: I0225 07:46:52.322405 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:46:52 crc kubenswrapper[4749]: E0225 07:46:52.323167 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:47:05 crc kubenswrapper[4749]: I0225 07:47:05.323184 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:47:05 crc kubenswrapper[4749]: E0225 07:47:05.324354 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:47:16 crc kubenswrapper[4749]: I0225 07:47:16.057473 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-lv8wj"] Feb 25 07:47:16 crc kubenswrapper[4749]: I0225 07:47:16.072108 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-lv8wj"] Feb 25 07:47:16 crc kubenswrapper[4749]: I0225 07:47:16.322473 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:47:16 crc kubenswrapper[4749]: E0225 07:47:16.322953 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:47:17 crc kubenswrapper[4749]: I0225 07:47:17.340485 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857df7ae-9c0e-47b6-9248-2981d1b1d796" path="/var/lib/kubelet/pods/857df7ae-9c0e-47b6-9248-2981d1b1d796/volumes" Feb 25 07:47:22 crc kubenswrapper[4749]: I0225 07:47:22.086569 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xld25"] Feb 25 07:47:22 crc kubenswrapper[4749]: I0225 07:47:22.103274 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hv26f"] Feb 25 07:47:22 crc kubenswrapper[4749]: I0225 07:47:22.114464 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hv26f"] Feb 25 07:47:22 crc 
kubenswrapper[4749]: I0225 07:47:22.124128 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xld25"] Feb 25 07:47:23 crc kubenswrapper[4749]: I0225 07:47:23.337924 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd296d6-e025-45bc-9149-d6595e7e683a" path="/var/lib/kubelet/pods/9dd296d6-e025-45bc-9149-d6595e7e683a/volumes" Feb 25 07:47:23 crc kubenswrapper[4749]: I0225 07:47:23.338752 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff" path="/var/lib/kubelet/pods/f4c4158a-d3b0-46c5-8dfc-fb628a3e8aff/volumes" Feb 25 07:47:25 crc kubenswrapper[4749]: I0225 07:47:25.058295 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kmwww"] Feb 25 07:47:25 crc kubenswrapper[4749]: I0225 07:47:25.072827 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kmwww"] Feb 25 07:47:25 crc kubenswrapper[4749]: I0225 07:47:25.353655 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18514a86-67f9-4c66-b5bb-59b4a2f34627" path="/var/lib/kubelet/pods/18514a86-67f9-4c66-b5bb-59b4a2f34627/volumes" Feb 25 07:47:26 crc kubenswrapper[4749]: I0225 07:47:26.902133 4749 generic.go:334] "Generic (PLEG): container finished" podID="f2f01883-686b-4aed-9458-ee14d1c3eb10" containerID="22157a196dad4ae1628f70ced2aae8c2cb5b01f28a2f3388977c6c351d72338a" exitCode=0 Feb 25 07:47:26 crc kubenswrapper[4749]: I0225 07:47:26.902211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" event={"ID":"f2f01883-686b-4aed-9458-ee14d1c3eb10","Type":"ContainerDied","Data":"22157a196dad4ae1628f70ced2aae8c2cb5b01f28a2f3388977c6c351d72338a"} Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.395461 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.594358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam\") pod \"f2f01883-686b-4aed-9458-ee14d1c3eb10\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.594712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7bwm\" (UniqueName: \"kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm\") pod \"f2f01883-686b-4aed-9458-ee14d1c3eb10\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.594885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory\") pod \"f2f01883-686b-4aed-9458-ee14d1c3eb10\" (UID: \"f2f01883-686b-4aed-9458-ee14d1c3eb10\") " Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.600686 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm" (OuterVolumeSpecName: "kube-api-access-k7bwm") pod "f2f01883-686b-4aed-9458-ee14d1c3eb10" (UID: "f2f01883-686b-4aed-9458-ee14d1c3eb10"). InnerVolumeSpecName "kube-api-access-k7bwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.621493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory" (OuterVolumeSpecName: "inventory") pod "f2f01883-686b-4aed-9458-ee14d1c3eb10" (UID: "f2f01883-686b-4aed-9458-ee14d1c3eb10"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.629968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f2f01883-686b-4aed-9458-ee14d1c3eb10" (UID: "f2f01883-686b-4aed-9458-ee14d1c3eb10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.698299 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.698383 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7bwm\" (UniqueName: \"kubernetes.io/projected/f2f01883-686b-4aed-9458-ee14d1c3eb10-kube-api-access-k7bwm\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.698411 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2f01883-686b-4aed-9458-ee14d1c3eb10-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.929492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" event={"ID":"f2f01883-686b-4aed-9458-ee14d1c3eb10","Type":"ContainerDied","Data":"d9a49a925e7a810a3a948cec8ea72ad9324536faa96bc8baebdf0a7b588aeb25"} Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 07:47:28.929543 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a49a925e7a810a3a948cec8ea72ad9324536faa96bc8baebdf0a7b588aeb25" Feb 25 07:47:28 crc kubenswrapper[4749]: I0225 
07:47:28.929645 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vfqng" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.035561 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m"] Feb 25 07:47:29 crc kubenswrapper[4749]: E0225 07:47:29.036389 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f01883-686b-4aed-9458-ee14d1c3eb10" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.036433 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f01883-686b-4aed-9458-ee14d1c3eb10" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.036811 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f01883-686b-4aed-9458-ee14d1c3eb10" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.037902 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.040384 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.041840 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.041961 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.042083 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.053190 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m"] Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.105127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbgp\" (UniqueName: \"kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.105198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 
07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.105247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.206673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbgp\" (UniqueName: \"kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.207136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.207365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.214624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.218752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.236682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcbgp\" (UniqueName: \"kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-42z9m\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.322077 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:47:29 crc kubenswrapper[4749]: E0225 07:47:29.322699 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:47:29 crc kubenswrapper[4749]: I0225 07:47:29.362674 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:30 crc kubenswrapper[4749]: I0225 07:47:30.002066 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:47:30 crc kubenswrapper[4749]: I0225 07:47:30.002569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m"] Feb 25 07:47:30 crc kubenswrapper[4749]: I0225 07:47:30.973855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" event={"ID":"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf","Type":"ContainerStarted","Data":"213bd4f63182ee9da6f7267e958bd585b2a6d4ef97f827cb20c3d03ec6e4155b"} Feb 25 07:47:30 crc kubenswrapper[4749]: I0225 07:47:30.974489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" event={"ID":"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf","Type":"ContainerStarted","Data":"c940d9bfbb487a67afbffd9d59520085d4db391bc55a0799da17bd9cb4d08a7c"} Feb 25 07:47:31 crc kubenswrapper[4749]: I0225 07:47:31.018178 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" podStartSLOduration=1.526396375 podStartE2EDuration="2.018152603s" podCreationTimestamp="2026-02-25 07:47:29 +0000 UTC" firstStartedPulling="2026-02-25 07:47:30.001784473 +0000 UTC m=+1803.363610503" lastFinishedPulling="2026-02-25 07:47:30.493540671 +0000 UTC m=+1803.855366731" observedRunningTime="2026-02-25 07:47:30.996364766 +0000 UTC m=+1804.358190826" watchObservedRunningTime="2026-02-25 07:47:31.018152603 +0000 UTC m=+1804.379978633" Feb 25 07:47:36 crc kubenswrapper[4749]: I0225 07:47:36.036778 4749 generic.go:334] "Generic (PLEG): container finished" podID="e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" 
containerID="213bd4f63182ee9da6f7267e958bd585b2a6d4ef97f827cb20c3d03ec6e4155b" exitCode=0 Feb 25 07:47:36 crc kubenswrapper[4749]: I0225 07:47:36.037420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" event={"ID":"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf","Type":"ContainerDied","Data":"213bd4f63182ee9da6f7267e958bd585b2a6d4ef97f827cb20c3d03ec6e4155b"} Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.519879 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.702832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcbgp\" (UniqueName: \"kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp\") pod \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.702955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam\") pod \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.703004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory\") pod \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\" (UID: \"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf\") " Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.712376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp" (OuterVolumeSpecName: 
"kube-api-access-qcbgp") pod "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" (UID: "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf"). InnerVolumeSpecName "kube-api-access-qcbgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.752575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory" (OuterVolumeSpecName: "inventory") pod "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" (UID: "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.752807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" (UID: "e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.805543 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcbgp\" (UniqueName: \"kubernetes.io/projected/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-kube-api-access-qcbgp\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.805582 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:37 crc kubenswrapper[4749]: I0225 07:47:37.805613 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.044155 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lth6k"] Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.060161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" event={"ID":"e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf","Type":"ContainerDied","Data":"c940d9bfbb487a67afbffd9d59520085d4db391bc55a0799da17bd9cb4d08a7c"} Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.060215 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c940d9bfbb487a67afbffd9d59520085d4db391bc55a0799da17bd9cb4d08a7c" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.060241 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-42z9m" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.061033 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lth6k"] Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.162997 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc"] Feb 25 07:47:38 crc kubenswrapper[4749]: E0225 07:47:38.163634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.163665 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.164039 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.165095 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.167933 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.168077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.168221 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.168816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.178276 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc"] Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.319673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.319878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.319923 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2lwr\" (UniqueName: \"kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.421437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.421740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2lwr\" (UniqueName: \"kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.421875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.425734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.426204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.442066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2lwr\" (UniqueName: \"kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-vjvzc\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:38 crc kubenswrapper[4749]: I0225 07:47:38.498239 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:47:39 crc kubenswrapper[4749]: I0225 07:47:39.059450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc"] Feb 25 07:47:39 crc kubenswrapper[4749]: I0225 07:47:39.348313 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b" path="/var/lib/kubelet/pods/8486c1f4-0eef-47fd-bcf2-b1d87cfcd88b/volumes" Feb 25 07:47:40 crc kubenswrapper[4749]: I0225 07:47:40.085350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" event={"ID":"2c3eb600-4864-4229-bbfc-6b24211fc914","Type":"ContainerStarted","Data":"ff1ea5993d965c7833293a088b37e999f10a958ef094987bef3879cdf56cf44f"} Feb 25 07:47:40 crc kubenswrapper[4749]: I0225 07:47:40.085426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" event={"ID":"2c3eb600-4864-4229-bbfc-6b24211fc914","Type":"ContainerStarted","Data":"d55f0653d381d81b97147d025ffffc1c10c3fdec83b2e78e94e13851b4779278"} Feb 25 07:47:40 crc kubenswrapper[4749]: I0225 07:47:40.112061 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" podStartSLOduration=1.7115870100000001 podStartE2EDuration="2.112036857s" podCreationTimestamp="2026-02-25 07:47:38 +0000 UTC" firstStartedPulling="2026-02-25 07:47:39.071089832 +0000 UTC m=+1812.432915862" lastFinishedPulling="2026-02-25 07:47:39.471539659 +0000 UTC m=+1812.833365709" observedRunningTime="2026-02-25 07:47:40.110659664 +0000 UTC m=+1813.472485724" watchObservedRunningTime="2026-02-25 07:47:40.112036857 +0000 UTC m=+1813.473862907" Feb 25 07:47:41 crc kubenswrapper[4749]: I0225 07:47:41.322778 4749 scope.go:117] "RemoveContainer" 
containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:47:41 crc kubenswrapper[4749]: E0225 07:47:41.323302 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.559399 4749 scope.go:117] "RemoveContainer" containerID="f78259553aa77bb6c03a624be52623826a73c94da505b7aac080dfa9b2811465" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.605824 4749 scope.go:117] "RemoveContainer" containerID="5ebb1bd025f7a445de93cc0182ab12d6b9979e6d7b6788cf4ed4ba3ef12ce2b7" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.691916 4749 scope.go:117] "RemoveContainer" containerID="0b8f218137bc510a74de90b8782b57aa796b646e29b704b0a8270409fc1b6d3c" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.738373 4749 scope.go:117] "RemoveContainer" containerID="fd14d92db0185b6c0743ec6251b41f313737ee589de39c0a2b2a54c801c571dc" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.777241 4749 scope.go:117] "RemoveContainer" containerID="e86715d496cbb6b72a1e08498898cae5fc9ebe0a100509ac2df6c1b23c436922" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.826630 4749 scope.go:117] "RemoveContainer" containerID="202531abc1663b771c9da627e2e6781d517f7d87e3c1d51ca8c45f6102bdc54f" Feb 25 07:47:43 crc kubenswrapper[4749]: I0225 07:47:43.861008 4749 scope.go:117] "RemoveContainer" containerID="78b10e22e85da74a02f9a9893d84ab73c5ad25cbc3a9b93544da99c360822b50" Feb 25 07:47:53 crc kubenswrapper[4749]: I0225 07:47:53.322481 4749 scope.go:117] "RemoveContainer" 
containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:47:53 crc kubenswrapper[4749]: E0225 07:47:53.323371 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.145491 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533428-z7djp"] Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.147869 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.154176 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.154815 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.154878 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.159378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533428-z7djp"] Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.184388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4n54\" (UniqueName: \"kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54\") pod \"auto-csr-approver-29533428-z7djp\" (UID: 
\"118325bc-53b6-454b-b62d-3eebfe528076\") " pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.286964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4n54\" (UniqueName: \"kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54\") pod \"auto-csr-approver-29533428-z7djp\" (UID: \"118325bc-53b6-454b-b62d-3eebfe528076\") " pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.314561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4n54\" (UniqueName: \"kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54\") pod \"auto-csr-approver-29533428-z7djp\" (UID: \"118325bc-53b6-454b-b62d-3eebfe528076\") " pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.473356 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:00 crc kubenswrapper[4749]: W0225 07:48:00.958571 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod118325bc_53b6_454b_b62d_3eebfe528076.slice/crio-30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a WatchSource:0}: Error finding container 30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a: Status 404 returned error can't find the container with id 30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a Feb 25 07:48:00 crc kubenswrapper[4749]: I0225 07:48:00.969648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533428-z7djp"] Feb 25 07:48:01 crc kubenswrapper[4749]: I0225 07:48:01.304223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533428-z7djp" event={"ID":"118325bc-53b6-454b-b62d-3eebfe528076","Type":"ContainerStarted","Data":"30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a"} Feb 25 07:48:02 crc kubenswrapper[4749]: I0225 07:48:02.313494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533428-z7djp" event={"ID":"118325bc-53b6-454b-b62d-3eebfe528076","Type":"ContainerStarted","Data":"363e28066c93ed9fc9ef31a14f41d969455d80f95e4a678127dea21420af0cc8"} Feb 25 07:48:02 crc kubenswrapper[4749]: I0225 07:48:02.336270 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533428-z7djp" podStartSLOduration=1.466573948 podStartE2EDuration="2.336247196s" podCreationTimestamp="2026-02-25 07:48:00 +0000 UTC" firstStartedPulling="2026-02-25 07:48:00.962346598 +0000 UTC m=+1834.324172658" lastFinishedPulling="2026-02-25 07:48:01.832019876 +0000 UTC m=+1835.193845906" observedRunningTime="2026-02-25 07:48:02.327676608 +0000 UTC m=+1835.689502648" 
watchObservedRunningTime="2026-02-25 07:48:02.336247196 +0000 UTC m=+1835.698073226" Feb 25 07:48:03 crc kubenswrapper[4749]: I0225 07:48:03.324549 4749 generic.go:334] "Generic (PLEG): container finished" podID="118325bc-53b6-454b-b62d-3eebfe528076" containerID="363e28066c93ed9fc9ef31a14f41d969455d80f95e4a678127dea21420af0cc8" exitCode=0 Feb 25 07:48:03 crc kubenswrapper[4749]: I0225 07:48:03.342307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533428-z7djp" event={"ID":"118325bc-53b6-454b-b62d-3eebfe528076","Type":"ContainerDied","Data":"363e28066c93ed9fc9ef31a14f41d969455d80f95e4a678127dea21420af0cc8"} Feb 25 07:48:04 crc kubenswrapper[4749]: I0225 07:48:04.709640 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:04 crc kubenswrapper[4749]: I0225 07:48:04.780117 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4n54\" (UniqueName: \"kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54\") pod \"118325bc-53b6-454b-b62d-3eebfe528076\" (UID: \"118325bc-53b6-454b-b62d-3eebfe528076\") " Feb 25 07:48:04 crc kubenswrapper[4749]: I0225 07:48:04.789315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54" (OuterVolumeSpecName: "kube-api-access-v4n54") pod "118325bc-53b6-454b-b62d-3eebfe528076" (UID: "118325bc-53b6-454b-b62d-3eebfe528076"). InnerVolumeSpecName "kube-api-access-v4n54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:48:04 crc kubenswrapper[4749]: I0225 07:48:04.881798 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4n54\" (UniqueName: \"kubernetes.io/projected/118325bc-53b6-454b-b62d-3eebfe528076-kube-api-access-v4n54\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:05 crc kubenswrapper[4749]: I0225 07:48:05.352916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533428-z7djp" event={"ID":"118325bc-53b6-454b-b62d-3eebfe528076","Type":"ContainerDied","Data":"30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a"} Feb 25 07:48:05 crc kubenswrapper[4749]: I0225 07:48:05.353027 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30b1dbead9131845bb4d112d1cf2cc6e8efbbaba0a5d36fe03cc878c85e3220a" Feb 25 07:48:05 crc kubenswrapper[4749]: I0225 07:48:05.353121 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533428-z7djp" Feb 25 07:48:05 crc kubenswrapper[4749]: I0225 07:48:05.438487 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533422-qqsch"] Feb 25 07:48:05 crc kubenswrapper[4749]: I0225 07:48:05.448923 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533422-qqsch"] Feb 25 07:48:07 crc kubenswrapper[4749]: I0225 07:48:07.345047 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea2508a-7716-4440-b6c5-3acb308707ea" path="/var/lib/kubelet/pods/1ea2508a-7716-4440-b6c5-3acb308707ea/volumes" Feb 25 07:48:08 crc kubenswrapper[4749]: I0225 07:48:08.322478 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:48:08 crc kubenswrapper[4749]: E0225 07:48:08.323393 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:48:17 crc kubenswrapper[4749]: I0225 07:48:17.548335 4749 generic.go:334] "Generic (PLEG): container finished" podID="2c3eb600-4864-4229-bbfc-6b24211fc914" containerID="ff1ea5993d965c7833293a088b37e999f10a958ef094987bef3879cdf56cf44f" exitCode=0 Feb 25 07:48:17 crc kubenswrapper[4749]: I0225 07:48:17.548958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" event={"ID":"2c3eb600-4864-4229-bbfc-6b24211fc914","Type":"ContainerDied","Data":"ff1ea5993d965c7833293a088b37e999f10a958ef094987bef3879cdf56cf44f"} Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.035841 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xbgmb"] Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.044965 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xbgmb"] Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.063201 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.085531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory\") pod \"2c3eb600-4864-4229-bbfc-6b24211fc914\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.085860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2lwr\" (UniqueName: \"kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr\") pod \"2c3eb600-4864-4229-bbfc-6b24211fc914\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.085908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam\") pod \"2c3eb600-4864-4229-bbfc-6b24211fc914\" (UID: \"2c3eb600-4864-4229-bbfc-6b24211fc914\") " Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.091403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr" (OuterVolumeSpecName: "kube-api-access-x2lwr") pod "2c3eb600-4864-4229-bbfc-6b24211fc914" (UID: "2c3eb600-4864-4229-bbfc-6b24211fc914"). InnerVolumeSpecName "kube-api-access-x2lwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.113297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory" (OuterVolumeSpecName: "inventory") pod "2c3eb600-4864-4229-bbfc-6b24211fc914" (UID: "2c3eb600-4864-4229-bbfc-6b24211fc914"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.122139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c3eb600-4864-4229-bbfc-6b24211fc914" (UID: "2c3eb600-4864-4229-bbfc-6b24211fc914"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.187957 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2lwr\" (UniqueName: \"kubernetes.io/projected/2c3eb600-4864-4229-bbfc-6b24211fc914-kube-api-access-x2lwr\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.188010 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.188029 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3eb600-4864-4229-bbfc-6b24211fc914-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.339439 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6538c07b-e233-452f-adeb-4a91300817de" path="/var/lib/kubelet/pods/6538c07b-e233-452f-adeb-4a91300817de/volumes" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.573382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" event={"ID":"2c3eb600-4864-4229-bbfc-6b24211fc914","Type":"ContainerDied","Data":"d55f0653d381d81b97147d025ffffc1c10c3fdec83b2e78e94e13851b4779278"} Feb 25 07:48:19 crc 
kubenswrapper[4749]: I0225 07:48:19.573908 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d55f0653d381d81b97147d025ffffc1c10c3fdec83b2e78e94e13851b4779278" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.573453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-vjvzc" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.730575 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk"] Feb 25 07:48:19 crc kubenswrapper[4749]: E0225 07:48:19.731008 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118325bc-53b6-454b-b62d-3eebfe528076" containerName="oc" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.731027 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="118325bc-53b6-454b-b62d-3eebfe528076" containerName="oc" Feb 25 07:48:19 crc kubenswrapper[4749]: E0225 07:48:19.731038 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3eb600-4864-4229-bbfc-6b24211fc914" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.731047 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3eb600-4864-4229-bbfc-6b24211fc914" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.731215 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3eb600-4864-4229-bbfc-6b24211fc914" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.731235 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="118325bc-53b6-454b-b62d-3eebfe528076" containerName="oc" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.731869 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.734972 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.735184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.735382 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.735660 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.752511 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk"] Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.804620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.804708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bghv\" (UniqueName: \"kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.804754 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.906332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.906454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.906522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bghv\" (UniqueName: \"kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.909521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.909766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:19 crc kubenswrapper[4749]: I0225 07:48:19.924296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bghv\" (UniqueName: \"kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:20 crc kubenswrapper[4749]: I0225 07:48:20.028047 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7jw8g"] Feb 25 07:48:20 crc kubenswrapper[4749]: I0225 07:48:20.037794 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7jw8g"] Feb 25 07:48:20 crc kubenswrapper[4749]: I0225 07:48:20.049957 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:48:20 crc kubenswrapper[4749]: I0225 07:48:20.322136 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:48:20 crc kubenswrapper[4749]: E0225 07:48:20.322522 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:48:20 crc kubenswrapper[4749]: I0225 07:48:20.588264 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.050148 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c449-account-create-update-rc7pl"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.065623 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7d49-account-create-update-2q4zh"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.082248 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c449-account-create-update-rc7pl"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.097790 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7gjg9"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.097852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-08a2-account-create-update-ng7wt"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.105557 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-08a2-account-create-update-ng7wt"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.113121 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7gjg9"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.122962 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7d49-account-create-update-2q4zh"] Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.335716 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2385f9b1-5956-4820-a03b-b9a7892c2e93" path="/var/lib/kubelet/pods/2385f9b1-5956-4820-a03b-b9a7892c2e93/volumes" Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.336311 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bc241f-11af-4075-b898-060df752a179" path="/var/lib/kubelet/pods/46bc241f-11af-4075-b898-060df752a179/volumes" Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.336928 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8184cd81-b73e-4815-8980-c117ecbddedb" path="/var/lib/kubelet/pods/8184cd81-b73e-4815-8980-c117ecbddedb/volumes" Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.337481 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f" path="/var/lib/kubelet/pods/b6c2c0a2-f3b4-4782-977d-e45ead5f0d3f/volumes" Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.338505 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406" path="/var/lib/kubelet/pods/c4b6bff5-789e-4d3f-bbe3-f5c52e4fe406/volumes" Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.605226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" event={"ID":"4237ec4c-49e0-4c6d-8a5c-d67583610f3d","Type":"ContainerStarted","Data":"0e4273973903299f15293762b933539165791703db881b679542fc085e8a4695"} Feb 25 
07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.605289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" event={"ID":"4237ec4c-49e0-4c6d-8a5c-d67583610f3d","Type":"ContainerStarted","Data":"c92d29a0d78a8b7181bdb14941145f92384d59373ef72b00bbbb161d1da448cd"} Feb 25 07:48:21 crc kubenswrapper[4749]: I0225 07:48:21.635652 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" podStartSLOduration=2.238962395 podStartE2EDuration="2.635631611s" podCreationTimestamp="2026-02-25 07:48:19 +0000 UTC" firstStartedPulling="2026-02-25 07:48:20.589782156 +0000 UTC m=+1853.951608176" lastFinishedPulling="2026-02-25 07:48:20.986451362 +0000 UTC m=+1854.348277392" observedRunningTime="2026-02-25 07:48:21.625559477 +0000 UTC m=+1854.987385507" watchObservedRunningTime="2026-02-25 07:48:21.635631611 +0000 UTC m=+1854.997457641" Feb 25 07:48:35 crc kubenswrapper[4749]: I0225 07:48:35.322153 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:48:35 crc kubenswrapper[4749]: E0225 07:48:35.323086 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.249956 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.252218 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.270592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.270754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4gp\" (UniqueName: \"kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.270789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.280078 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.373630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4gp\" (UniqueName: \"kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.373708 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.373775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.374420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.374719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.402125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4gp\" (UniqueName: \"kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp\") pod \"community-operators-gnd8m\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:37 crc kubenswrapper[4749]: I0225 07:48:37.577945 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:38 crc kubenswrapper[4749]: I0225 07:48:38.115321 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:38 crc kubenswrapper[4749]: I0225 07:48:38.788653 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerID="3a1bb7c598b84fc53ff2bf418021500403edc949604b9586f9d973246186cc05" exitCode=0 Feb 25 07:48:38 crc kubenswrapper[4749]: I0225 07:48:38.788749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerDied","Data":"3a1bb7c598b84fc53ff2bf418021500403edc949604b9586f9d973246186cc05"} Feb 25 07:48:38 crc kubenswrapper[4749]: I0225 07:48:38.788795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerStarted","Data":"f7ebd6dc25af1d2261f18e0870684909ded3ae666bb5b4bb810036560d631734"} Feb 25 07:48:39 crc kubenswrapper[4749]: I0225 07:48:39.801212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerStarted","Data":"19040d9d7703416162ba139345b975b5646e555b8d86c8ac487dd2483c594b26"} Feb 25 07:48:40 crc kubenswrapper[4749]: I0225 07:48:40.815314 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerID="19040d9d7703416162ba139345b975b5646e555b8d86c8ac487dd2483c594b26" exitCode=0 Feb 25 07:48:40 crc kubenswrapper[4749]: I0225 07:48:40.816354 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" 
event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerDied","Data":"19040d9d7703416162ba139345b975b5646e555b8d86c8ac487dd2483c594b26"} Feb 25 07:48:41 crc kubenswrapper[4749]: I0225 07:48:41.828320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerStarted","Data":"74a74e59b83cb701fecd6e453868b2a38e024bc4c48bdeaadbbdcc6860aa2f97"} Feb 25 07:48:41 crc kubenswrapper[4749]: I0225 07:48:41.864272 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnd8m" podStartSLOduration=2.350679392 podStartE2EDuration="4.864247277s" podCreationTimestamp="2026-02-25 07:48:37 +0000 UTC" firstStartedPulling="2026-02-25 07:48:38.793206333 +0000 UTC m=+1872.155032353" lastFinishedPulling="2026-02-25 07:48:41.306774218 +0000 UTC m=+1874.668600238" observedRunningTime="2026-02-25 07:48:41.85858923 +0000 UTC m=+1875.220415240" watchObservedRunningTime="2026-02-25 07:48:41.864247277 +0000 UTC m=+1875.226073307" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.097441 4749 scope.go:117] "RemoveContainer" containerID="77586455684cc5d95908b1a9e4ec84950605b8ef4a2bccee1009dca9afe33fb9" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.150414 4749 scope.go:117] "RemoveContainer" containerID="fe2bda90df2d0745b8b46aef039eb990bc18fa0f303933d99ce5acdda64aa4a4" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.206205 4749 scope.go:117] "RemoveContainer" containerID="798752ffa2b19f79a2a814a2fec00dcc5ac69798af35f5feb24faac6367f319b" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.237219 4749 scope.go:117] "RemoveContainer" containerID="c8b89ebe4cda7eaf8a75165f9d9a1deaa05de4b7e647ebb6c8c4c82d7dce31a9" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.279713 4749 scope.go:117] "RemoveContainer" 
containerID="4a42adb1053b219dae8e3dca59d461530bd0a92fbf109abc5db7a14c1205307a" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.312692 4749 scope.go:117] "RemoveContainer" containerID="1c8f76209c44fb02dd1aef9d22bd8fe14eaee0205b0649039d4ea8b01f13672f" Feb 25 07:48:44 crc kubenswrapper[4749]: I0225 07:48:44.364185 4749 scope.go:117] "RemoveContainer" containerID="78a27a57542f77edef9ca8e2f387c44669bd84acb03d5da544768c9a1da03655" Feb 25 07:48:47 crc kubenswrapper[4749]: I0225 07:48:47.578677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:47 crc kubenswrapper[4749]: I0225 07:48:47.578979 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:47 crc kubenswrapper[4749]: I0225 07:48:47.640327 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:47 crc kubenswrapper[4749]: I0225 07:48:47.947291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:47 crc kubenswrapper[4749]: I0225 07:48:47.989469 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:48 crc kubenswrapper[4749]: I0225 07:48:48.322442 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:48:48 crc kubenswrapper[4749]: E0225 07:48:48.322895 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:48:49 crc kubenswrapper[4749]: I0225 07:48:49.917261 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gnd8m" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="registry-server" containerID="cri-o://74a74e59b83cb701fecd6e453868b2a38e024bc4c48bdeaadbbdcc6860aa2f97" gracePeriod=2 Feb 25 07:48:50 crc kubenswrapper[4749]: I0225 07:48:50.063475 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mlb5c"] Feb 25 07:48:50 crc kubenswrapper[4749]: I0225 07:48:50.071346 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mlb5c"] Feb 25 07:48:50 crc kubenswrapper[4749]: I0225 07:48:50.928039 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerID="74a74e59b83cb701fecd6e453868b2a38e024bc4c48bdeaadbbdcc6860aa2f97" exitCode=0 Feb 25 07:48:50 crc kubenswrapper[4749]: I0225 07:48:50.928083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerDied","Data":"74a74e59b83cb701fecd6e453868b2a38e024bc4c48bdeaadbbdcc6860aa2f97"} Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.334759 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f52cb2d-084a-4cf4-95c9-facab10be752" path="/var/lib/kubelet/pods/3f52cb2d-084a-4cf4-95c9-facab10be752/volumes" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.633122 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.666577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content\") pod \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.666641 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities\") pod \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.666830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf4gp\" (UniqueName: \"kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp\") pod \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\" (UID: \"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d\") " Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.667485 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities" (OuterVolumeSpecName: "utilities") pod "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" (UID: "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.672979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp" (OuterVolumeSpecName: "kube-api-access-jf4gp") pod "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" (UID: "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d"). InnerVolumeSpecName "kube-api-access-jf4gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.735391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" (UID: "3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.768152 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf4gp\" (UniqueName: \"kubernetes.io/projected/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-kube-api-access-jf4gp\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.768454 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.768470 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.941808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnd8m" event={"ID":"3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d","Type":"ContainerDied","Data":"f7ebd6dc25af1d2261f18e0870684909ded3ae666bb5b4bb810036560d631734"} Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.941879 4749 scope.go:117] "RemoveContainer" containerID="74a74e59b83cb701fecd6e453868b2a38e024bc4c48bdeaadbbdcc6860aa2f97" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.941888 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnd8m" Feb 25 07:48:51 crc kubenswrapper[4749]: I0225 07:48:51.971789 4749 scope.go:117] "RemoveContainer" containerID="19040d9d7703416162ba139345b975b5646e555b8d86c8ac487dd2483c594b26" Feb 25 07:48:52 crc kubenswrapper[4749]: I0225 07:48:52.027587 4749 scope.go:117] "RemoveContainer" containerID="3a1bb7c598b84fc53ff2bf418021500403edc949604b9586f9d973246186cc05" Feb 25 07:48:52 crc kubenswrapper[4749]: I0225 07:48:52.035955 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:52 crc kubenswrapper[4749]: I0225 07:48:52.047085 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gnd8m"] Feb 25 07:48:53 crc kubenswrapper[4749]: I0225 07:48:53.334969 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" path="/var/lib/kubelet/pods/3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d/volumes" Feb 25 07:49:02 crc kubenswrapper[4749]: I0225 07:49:02.322484 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:49:02 crc kubenswrapper[4749]: E0225 07:49:02.323280 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:49:10 crc kubenswrapper[4749]: I0225 07:49:10.161307 4749 generic.go:334] "Generic (PLEG): container finished" podID="4237ec4c-49e0-4c6d-8a5c-d67583610f3d" containerID="0e4273973903299f15293762b933539165791703db881b679542fc085e8a4695" exitCode=0 Feb 25 07:49:10 crc 
kubenswrapper[4749]: I0225 07:49:10.161467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" event={"ID":"4237ec4c-49e0-4c6d-8a5c-d67583610f3d","Type":"ContainerDied","Data":"0e4273973903299f15293762b933539165791703db881b679542fc085e8a4695"} Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.065390 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-n7jcl"] Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.083416 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-n7jcl"] Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.334432 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ee757a-870c-4c47-847c-bed7addddb21" path="/var/lib/kubelet/pods/58ee757a-870c-4c47-847c-bed7addddb21/volumes" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.683466 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.781718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bghv\" (UniqueName: \"kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv\") pod \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.782070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory\") pod \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.782121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam\") pod \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\" (UID: \"4237ec4c-49e0-4c6d-8a5c-d67583610f3d\") " Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.793805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv" (OuterVolumeSpecName: "kube-api-access-8bghv") pod "4237ec4c-49e0-4c6d-8a5c-d67583610f3d" (UID: "4237ec4c-49e0-4c6d-8a5c-d67583610f3d"). InnerVolumeSpecName "kube-api-access-8bghv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.807114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4237ec4c-49e0-4c6d-8a5c-d67583610f3d" (UID: "4237ec4c-49e0-4c6d-8a5c-d67583610f3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.811541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory" (OuterVolumeSpecName: "inventory") pod "4237ec4c-49e0-4c6d-8a5c-d67583610f3d" (UID: "4237ec4c-49e0-4c6d-8a5c-d67583610f3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.884438 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.884746 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:11 crc kubenswrapper[4749]: I0225 07:49:11.884836 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bghv\" (UniqueName: \"kubernetes.io/projected/4237ec4c-49e0-4c6d-8a5c-d67583610f3d-kube-api-access-8bghv\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.046764 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xxh4"] Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 
07:49:12.056658 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xxh4"] Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.182306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" event={"ID":"4237ec4c-49e0-4c6d-8a5c-d67583610f3d","Type":"ContainerDied","Data":"c92d29a0d78a8b7181bdb14941145f92384d59373ef72b00bbbb161d1da448cd"} Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.182664 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92d29a0d78a8b7181bdb14941145f92384d59373ef72b00bbbb161d1da448cd" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.182402 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.290401 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w792"] Feb 25 07:49:12 crc kubenswrapper[4749]: E0225 07:49:12.290943 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="registry-server" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.290968 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="registry-server" Feb 25 07:49:12 crc kubenswrapper[4749]: E0225 07:49:12.290992 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4237ec4c-49e0-4c6d-8a5c-d67583610f3d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.291005 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4237ec4c-49e0-4c6d-8a5c-d67583610f3d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:12 crc kubenswrapper[4749]: E0225 07:49:12.291022 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="extract-content" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.291034 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="extract-content" Feb 25 07:49:12 crc kubenswrapper[4749]: E0225 07:49:12.291072 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="extract-utilities" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.291083 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="extract-utilities" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.291345 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4237ec4c-49e0-4c6d-8a5c-d67583610f3d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.291376 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9bd2d5-1e8c-47a3-ad31-c5d97e21604d" containerName="registry-server" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.292494 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.294373 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.294682 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.297807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.306557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w792"] Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.308457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.392587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmt79\" (UniqueName: \"kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.394194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.394500 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.497117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmt79\" (UniqueName: \"kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.497338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.497520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.501518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 
25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.501785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.535509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmt79\" (UniqueName: \"kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79\") pod \"ssh-known-hosts-edpm-deployment-2w792\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:12 crc kubenswrapper[4749]: I0225 07:49:12.626457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:13 crc kubenswrapper[4749]: I0225 07:49:13.173393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-2w792"] Feb 25 07:49:13 crc kubenswrapper[4749]: I0225 07:49:13.197792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" event={"ID":"357114b3-24d4-4f6f-ba27-99c4314d110d","Type":"ContainerStarted","Data":"3678fb566ab09b1db17c07630c06e1bc4356f2655099e60cd0be75ad3ffad86f"} Feb 25 07:49:13 crc kubenswrapper[4749]: I0225 07:49:13.322460 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:49:13 crc kubenswrapper[4749]: E0225 07:49:13.322883 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:49:13 crc kubenswrapper[4749]: I0225 07:49:13.343853 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0b654d-70c1-4ea5-b508-1c365f26720a" path="/var/lib/kubelet/pods/4b0b654d-70c1-4ea5-b508-1c365f26720a/volumes" Feb 25 07:49:14 crc kubenswrapper[4749]: I0225 07:49:14.212081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" event={"ID":"357114b3-24d4-4f6f-ba27-99c4314d110d","Type":"ContainerStarted","Data":"3d87645d810dcc85b15847d004de70c042213bfe039e6037d5875f96c0bfd303"} Feb 25 07:49:14 crc kubenswrapper[4749]: I0225 07:49:14.241042 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" podStartSLOduration=1.758967127 podStartE2EDuration="2.240991587s" podCreationTimestamp="2026-02-25 07:49:12 +0000 UTC" firstStartedPulling="2026-02-25 07:49:13.178648953 +0000 UTC m=+1906.540475013" lastFinishedPulling="2026-02-25 07:49:13.660673443 +0000 UTC m=+1907.022499473" observedRunningTime="2026-02-25 07:49:14.23987249 +0000 UTC m=+1907.601698510" watchObservedRunningTime="2026-02-25 07:49:14.240991587 +0000 UTC m=+1907.602817647" Feb 25 07:49:21 crc kubenswrapper[4749]: I0225 07:49:21.307048 4749 generic.go:334] "Generic (PLEG): container finished" podID="357114b3-24d4-4f6f-ba27-99c4314d110d" containerID="3d87645d810dcc85b15847d004de70c042213bfe039e6037d5875f96c0bfd303" exitCode=0 Feb 25 07:49:21 crc kubenswrapper[4749]: I0225 07:49:21.307121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" 
event={"ID":"357114b3-24d4-4f6f-ba27-99c4314d110d","Type":"ContainerDied","Data":"3d87645d810dcc85b15847d004de70c042213bfe039e6037d5875f96c0bfd303"} Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.841021 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.935337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmt79\" (UniqueName: \"kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79\") pod \"357114b3-24d4-4f6f-ba27-99c4314d110d\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.935478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam\") pod \"357114b3-24d4-4f6f-ba27-99c4314d110d\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.935535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0\") pod \"357114b3-24d4-4f6f-ba27-99c4314d110d\" (UID: \"357114b3-24d4-4f6f-ba27-99c4314d110d\") " Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.942114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79" (OuterVolumeSpecName: "kube-api-access-tmt79") pod "357114b3-24d4-4f6f-ba27-99c4314d110d" (UID: "357114b3-24d4-4f6f-ba27-99c4314d110d"). InnerVolumeSpecName "kube-api-access-tmt79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.971651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "357114b3-24d4-4f6f-ba27-99c4314d110d" (UID: "357114b3-24d4-4f6f-ba27-99c4314d110d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:22 crc kubenswrapper[4749]: I0225 07:49:22.991135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "357114b3-24d4-4f6f-ba27-99c4314d110d" (UID: "357114b3-24d4-4f6f-ba27-99c4314d110d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.037628 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmt79\" (UniqueName: \"kubernetes.io/projected/357114b3-24d4-4f6f-ba27-99c4314d110d-kube-api-access-tmt79\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.037669 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.037713 4749 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/357114b3-24d4-4f6f-ba27-99c4314d110d-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.329688 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.335037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-2w792" event={"ID":"357114b3-24d4-4f6f-ba27-99c4314d110d","Type":"ContainerDied","Data":"3678fb566ab09b1db17c07630c06e1bc4356f2655099e60cd0be75ad3ffad86f"} Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.335084 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3678fb566ab09b1db17c07630c06e1bc4356f2655099e60cd0be75ad3ffad86f" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.406626 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb"] Feb 25 07:49:23 crc kubenswrapper[4749]: E0225 07:49:23.407077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357114b3-24d4-4f6f-ba27-99c4314d110d" containerName="ssh-known-hosts-edpm-deployment" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.407121 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="357114b3-24d4-4f6f-ba27-99c4314d110d" containerName="ssh-known-hosts-edpm-deployment" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.407351 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="357114b3-24d4-4f6f-ba27-99c4314d110d" containerName="ssh-known-hosts-edpm-deployment" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.408189 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.412184 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.412798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.419726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb"] Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.423712 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.423718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.445620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4fwg\" (UniqueName: \"kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.445715 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.445812 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.547509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.547631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4fwg\" (UniqueName: \"kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.547723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.552497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.553946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.563516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4fwg\" (UniqueName: \"kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nfvzb\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:23 crc kubenswrapper[4749]: I0225 07:49:23.741146 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:24 crc kubenswrapper[4749]: I0225 07:49:24.279197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb"] Feb 25 07:49:24 crc kubenswrapper[4749]: I0225 07:49:24.322618 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:49:24 crc kubenswrapper[4749]: E0225 07:49:24.322906 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:49:24 crc kubenswrapper[4749]: I0225 07:49:24.340441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" event={"ID":"11ec0661-d541-4c78-bc67-3bcb2e908694","Type":"ContainerStarted","Data":"bf44b974f6fb19b378a15091c84c0a3ae79d64741f716d2e3cf9c3f4d4609082"} Feb 25 07:49:25 crc kubenswrapper[4749]: I0225 07:49:25.348876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" event={"ID":"11ec0661-d541-4c78-bc67-3bcb2e908694","Type":"ContainerStarted","Data":"e9351748605dea99b25143f0d3219b55c20287a88089e79bc3322aa7fb2bf66e"} Feb 25 07:49:25 crc kubenswrapper[4749]: I0225 07:49:25.366204 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" podStartSLOduration=1.9690704989999999 podStartE2EDuration="2.366185549s" podCreationTimestamp="2026-02-25 07:49:23 +0000 UTC" firstStartedPulling="2026-02-25 07:49:24.28746365 +0000 
UTC m=+1917.649289680" lastFinishedPulling="2026-02-25 07:49:24.6845787 +0000 UTC m=+1918.046404730" observedRunningTime="2026-02-25 07:49:25.360218755 +0000 UTC m=+1918.722044785" watchObservedRunningTime="2026-02-25 07:49:25.366185549 +0000 UTC m=+1918.728011569" Feb 25 07:49:33 crc kubenswrapper[4749]: I0225 07:49:33.435641 4749 generic.go:334] "Generic (PLEG): container finished" podID="11ec0661-d541-4c78-bc67-3bcb2e908694" containerID="e9351748605dea99b25143f0d3219b55c20287a88089e79bc3322aa7fb2bf66e" exitCode=0 Feb 25 07:49:33 crc kubenswrapper[4749]: I0225 07:49:33.435764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" event={"ID":"11ec0661-d541-4c78-bc67-3bcb2e908694","Type":"ContainerDied","Data":"e9351748605dea99b25143f0d3219b55c20287a88089e79bc3322aa7fb2bf66e"} Feb 25 07:49:34 crc kubenswrapper[4749]: I0225 07:49:34.887696 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:34 crc kubenswrapper[4749]: I0225 07:49:34.989676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4fwg\" (UniqueName: \"kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg\") pod \"11ec0661-d541-4c78-bc67-3bcb2e908694\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " Feb 25 07:49:34 crc kubenswrapper[4749]: I0225 07:49:34.989790 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam\") pod \"11ec0661-d541-4c78-bc67-3bcb2e908694\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " Feb 25 07:49:34 crc kubenswrapper[4749]: I0225 07:49:34.989981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory\") pod \"11ec0661-d541-4c78-bc67-3bcb2e908694\" (UID: \"11ec0661-d541-4c78-bc67-3bcb2e908694\") " Feb 25 07:49:34 crc kubenswrapper[4749]: I0225 07:49:34.995925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg" (OuterVolumeSpecName: "kube-api-access-f4fwg") pod "11ec0661-d541-4c78-bc67-3bcb2e908694" (UID: "11ec0661-d541-4c78-bc67-3bcb2e908694"). InnerVolumeSpecName "kube-api-access-f4fwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.016218 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory" (OuterVolumeSpecName: "inventory") pod "11ec0661-d541-4c78-bc67-3bcb2e908694" (UID: "11ec0661-d541-4c78-bc67-3bcb2e908694"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.018199 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "11ec0661-d541-4c78-bc67-3bcb2e908694" (UID: "11ec0661-d541-4c78-bc67-3bcb2e908694"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.092302 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4fwg\" (UniqueName: \"kubernetes.io/projected/11ec0661-d541-4c78-bc67-3bcb2e908694-kube-api-access-f4fwg\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.092342 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.092358 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11ec0661-d541-4c78-bc67-3bcb2e908694-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.459937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" event={"ID":"11ec0661-d541-4c78-bc67-3bcb2e908694","Type":"ContainerDied","Data":"bf44b974f6fb19b378a15091c84c0a3ae79d64741f716d2e3cf9c3f4d4609082"} Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.459996 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nfvzb" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.460004 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf44b974f6fb19b378a15091c84c0a3ae79d64741f716d2e3cf9c3f4d4609082" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.548391 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd"] Feb 25 07:49:35 crc kubenswrapper[4749]: E0225 07:49:35.549023 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ec0661-d541-4c78-bc67-3bcb2e908694" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.549053 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ec0661-d541-4c78-bc67-3bcb2e908694" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.549377 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ec0661-d541-4c78-bc67-3bcb2e908694" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.550453 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.554333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.554343 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.554483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.556611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd"] Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.558724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.601518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.601705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.601904 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnp4\" (UniqueName: \"kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.704252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.704314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.704371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnp4\" (UniqueName: \"kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.708668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.710488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.726125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnp4\" (UniqueName: \"kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:35 crc kubenswrapper[4749]: I0225 07:49:35.871977 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:36 crc kubenswrapper[4749]: I0225 07:49:36.497397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd"] Feb 25 07:49:37 crc kubenswrapper[4749]: I0225 07:49:37.336911 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:49:37 crc kubenswrapper[4749]: E0225 07:49:37.337983 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:49:37 crc kubenswrapper[4749]: I0225 07:49:37.483615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" event={"ID":"daf53e51-e69a-43fd-bfa4-50ffcd4c9234","Type":"ContainerStarted","Data":"a135496323e57fbc2828cd992ba57fc1033d1b7354d4f88315830384130a0a0c"} Feb 25 07:49:37 crc kubenswrapper[4749]: I0225 07:49:37.484006 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" event={"ID":"daf53e51-e69a-43fd-bfa4-50ffcd4c9234","Type":"ContainerStarted","Data":"089f3a919b8e436f5f97b83eee3fc63c415582ab68a2b702b873b56e33d54990"} Feb 25 07:49:37 crc kubenswrapper[4749]: I0225 07:49:37.505174 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" podStartSLOduration=1.91873059 podStartE2EDuration="2.505154521s" podCreationTimestamp="2026-02-25 07:49:35 +0000 UTC" firstStartedPulling="2026-02-25 
07:49:36.487904767 +0000 UTC m=+1929.849730787" lastFinishedPulling="2026-02-25 07:49:37.074328658 +0000 UTC m=+1930.436154718" observedRunningTime="2026-02-25 07:49:37.504774942 +0000 UTC m=+1930.866601042" watchObservedRunningTime="2026-02-25 07:49:37.505154521 +0000 UTC m=+1930.866980541" Feb 25 07:49:44 crc kubenswrapper[4749]: I0225 07:49:44.538294 4749 scope.go:117] "RemoveContainer" containerID="2a450491912714d76d5fadba04571a48c41b9b9a27ae123cc77f3021d6fafad3" Feb 25 07:49:44 crc kubenswrapper[4749]: I0225 07:49:44.626702 4749 scope.go:117] "RemoveContainer" containerID="1f08ac38d6d2521c0f4244d5bc80a4fe0909319c97570ac09afd06277a253b74" Feb 25 07:49:44 crc kubenswrapper[4749]: I0225 07:49:44.681376 4749 scope.go:117] "RemoveContainer" containerID="1b127a38cf4d7012536adf90f4ac9dd659ec816d4b71abb427527ef47dcd37d3" Feb 25 07:49:46 crc kubenswrapper[4749]: I0225 07:49:46.567944 4749 generic.go:334] "Generic (PLEG): container finished" podID="daf53e51-e69a-43fd-bfa4-50ffcd4c9234" containerID="a135496323e57fbc2828cd992ba57fc1033d1b7354d4f88315830384130a0a0c" exitCode=0 Feb 25 07:49:46 crc kubenswrapper[4749]: I0225 07:49:46.568023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" event={"ID":"daf53e51-e69a-43fd-bfa4-50ffcd4c9234","Type":"ContainerDied","Data":"a135496323e57fbc2828cd992ba57fc1033d1b7354d4f88315830384130a0a0c"} Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.001780 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.058136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam\") pod \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.058186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory\") pod \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.059554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpnp4\" (UniqueName: \"kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4\") pod \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\" (UID: \"daf53e51-e69a-43fd-bfa4-50ffcd4c9234\") " Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.066018 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4" (OuterVolumeSpecName: "kube-api-access-vpnp4") pod "daf53e51-e69a-43fd-bfa4-50ffcd4c9234" (UID: "daf53e51-e69a-43fd-bfa4-50ffcd4c9234"). InnerVolumeSpecName "kube-api-access-vpnp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.088540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "daf53e51-e69a-43fd-bfa4-50ffcd4c9234" (UID: "daf53e51-e69a-43fd-bfa4-50ffcd4c9234"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.091212 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory" (OuterVolumeSpecName: "inventory") pod "daf53e51-e69a-43fd-bfa4-50ffcd4c9234" (UID: "daf53e51-e69a-43fd-bfa4-50ffcd4c9234"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.163772 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.163821 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.163838 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpnp4\" (UniqueName: \"kubernetes.io/projected/daf53e51-e69a-43fd-bfa4-50ffcd4c9234-kube-api-access-vpnp4\") on node \"crc\" DevicePath \"\"" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.322652 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:49:48 crc kubenswrapper[4749]: 
E0225 07:49:48.323016 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.593489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" event={"ID":"daf53e51-e69a-43fd-bfa4-50ffcd4c9234","Type":"ContainerDied","Data":"089f3a919b8e436f5f97b83eee3fc63c415582ab68a2b702b873b56e33d54990"} Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.593534 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089f3a919b8e436f5f97b83eee3fc63c415582ab68a2b702b873b56e33d54990" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.593554 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.769820 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz"] Feb 25 07:49:48 crc kubenswrapper[4749]: E0225 07:49:48.770265 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf53e51-e69a-43fd-bfa4-50ffcd4c9234" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.770287 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf53e51-e69a-43fd-bfa4-50ffcd4c9234" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.770514 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf53e51-e69a-43fd-bfa4-50ffcd4c9234" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.771355 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.774193 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.774400 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.774839 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.775005 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.775239 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.775701 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.775888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.776061 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.780634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz"] Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883579 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883792 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvrwh\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.883960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.884529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986303 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: 
\"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvrwh\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 
07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.986671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.991950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.992846 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.993920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.994243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.994350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.994378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.994444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.995317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.995865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.996130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:48 crc kubenswrapper[4749]: I0225 07:49:48.999451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:49 crc kubenswrapper[4749]: I0225 07:49:49.000328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:49 crc kubenswrapper[4749]: I0225 07:49:49.002105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:49 crc kubenswrapper[4749]: I0225 07:49:49.013552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvrwh\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:49 crc kubenswrapper[4749]: I0225 07:49:49.094764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:49:49 crc kubenswrapper[4749]: I0225 07:49:49.670352 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz"] Feb 25 07:49:49 crc kubenswrapper[4749]: W0225 07:49:49.676543 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e445ab6_1f18_49fe_b3f4_0921714e4d08.slice/crio-230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63 WatchSource:0}: Error finding container 230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63: Status 404 returned error can't find the container with id 230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63 Feb 25 07:49:50 crc kubenswrapper[4749]: I0225 07:49:50.615565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" event={"ID":"7e445ab6-1f18-49fe-b3f4-0921714e4d08","Type":"ContainerStarted","Data":"6e85d99044d16bc9a3908d09e6265e0d86c170f091cf089066ea33ae0308e233"} Feb 25 07:49:50 crc kubenswrapper[4749]: I0225 07:49:50.615821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" event={"ID":"7e445ab6-1f18-49fe-b3f4-0921714e4d08","Type":"ContainerStarted","Data":"230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63"} Feb 25 07:49:50 crc kubenswrapper[4749]: I0225 07:49:50.651741 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" podStartSLOduration=2.235269599 podStartE2EDuration="2.651718656s" 
podCreationTimestamp="2026-02-25 07:49:48 +0000 UTC" firstStartedPulling="2026-02-25 07:49:49.680895032 +0000 UTC m=+1943.042721062" lastFinishedPulling="2026-02-25 07:49:50.097344089 +0000 UTC m=+1943.459170119" observedRunningTime="2026-02-25 07:49:50.644564924 +0000 UTC m=+1944.006390944" watchObservedRunningTime="2026-02-25 07:49:50.651718656 +0000 UTC m=+1944.013544686" Feb 25 07:49:57 crc kubenswrapper[4749]: I0225 07:49:57.041723 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pcvmq"] Feb 25 07:49:57 crc kubenswrapper[4749]: I0225 07:49:57.049258 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pcvmq"] Feb 25 07:49:57 crc kubenswrapper[4749]: I0225 07:49:57.371462 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5010dcd-fe23-46b8-8df4-f955ce98e324" path="/var/lib/kubelet/pods/d5010dcd-fe23-46b8-8df4-f955ce98e324/volumes" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.134093 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533430-sxqcb"] Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.135919 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.139016 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.139161 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.140091 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.146107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533430-sxqcb"] Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.223239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5zxz\" (UniqueName: \"kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz\") pod \"auto-csr-approver-29533430-sxqcb\" (UID: \"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72\") " pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.325110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5zxz\" (UniqueName: \"kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz\") pod \"auto-csr-approver-29533430-sxqcb\" (UID: \"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72\") " pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.343024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5zxz\" (UniqueName: \"kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz\") pod \"auto-csr-approver-29533430-sxqcb\" (UID: \"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72\") " 
pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.454216 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:00 crc kubenswrapper[4749]: I0225 07:50:00.889003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533430-sxqcb"] Feb 25 07:50:01 crc kubenswrapper[4749]: I0225 07:50:01.804130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" event={"ID":"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72","Type":"ContainerStarted","Data":"2fa5dbbbc09b7f6b2b46dcd3392c53f7002f3ea0eb7fb6e8d930240e477a4907"} Feb 25 07:50:02 crc kubenswrapper[4749]: I0225 07:50:02.817430 4749 generic.go:334] "Generic (PLEG): container finished" podID="3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" containerID="719070ce472a459a1ad95b1b86000193936c1d21bb1b30fb00e97c567d49ff0c" exitCode=0 Feb 25 07:50:02 crc kubenswrapper[4749]: I0225 07:50:02.817526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" event={"ID":"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72","Type":"ContainerDied","Data":"719070ce472a459a1ad95b1b86000193936c1d21bb1b30fb00e97c567d49ff0c"} Feb 25 07:50:03 crc kubenswrapper[4749]: I0225 07:50:03.322392 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:50:03 crc kubenswrapper[4749]: I0225 07:50:03.826672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673"} Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.159582 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.311371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5zxz\" (UniqueName: \"kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz\") pod \"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72\" (UID: \"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72\") " Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.316819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz" (OuterVolumeSpecName: "kube-api-access-h5zxz") pod "3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" (UID: "3461bb05-8639-4ffd-bb8d-7e5fdb09ed72"). InnerVolumeSpecName "kube-api-access-h5zxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.414869 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5zxz\" (UniqueName: \"kubernetes.io/projected/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72-kube-api-access-h5zxz\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.838043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" event={"ID":"3461bb05-8639-4ffd-bb8d-7e5fdb09ed72","Type":"ContainerDied","Data":"2fa5dbbbc09b7f6b2b46dcd3392c53f7002f3ea0eb7fb6e8d930240e477a4907"} Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.838098 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa5dbbbc09b7f6b2b46dcd3392c53f7002f3ea0eb7fb6e8d930240e477a4907" Feb 25 07:50:04 crc kubenswrapper[4749]: I0225 07:50:04.838166 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533430-sxqcb" Feb 25 07:50:05 crc kubenswrapper[4749]: I0225 07:50:05.243273 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533424-859g2"] Feb 25 07:50:05 crc kubenswrapper[4749]: I0225 07:50:05.252198 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533424-859g2"] Feb 25 07:50:05 crc kubenswrapper[4749]: I0225 07:50:05.336290 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4a095a-b2c0-4f5b-9c6f-36438f2570b9" path="/var/lib/kubelet/pods/7c4a095a-b2c0-4f5b-9c6f-36438f2570b9/volumes" Feb 25 07:50:28 crc kubenswrapper[4749]: I0225 07:50:28.079211 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e445ab6-1f18-49fe-b3f4-0921714e4d08" containerID="6e85d99044d16bc9a3908d09e6265e0d86c170f091cf089066ea33ae0308e233" exitCode=0 Feb 25 07:50:28 crc kubenswrapper[4749]: I0225 07:50:28.079286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" event={"ID":"7e445ab6-1f18-49fe-b3f4-0921714e4d08","Type":"ContainerDied","Data":"6e85d99044d16bc9a3908d09e6265e0d86c170f091cf089066ea33ae0308e233"} Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.559061 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.644968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvrwh\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: 
\"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.645416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle\") pod \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\" (UID: \"7e445ab6-1f18-49fe-b3f4-0921714e4d08\") " Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.654774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: 
"7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.656833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.657859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh" (OuterVolumeSpecName: "kube-api-access-cvrwh") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "kube-api-access-cvrwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.657864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.657965 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.658033 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.658129 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.663139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.663584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.664062 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.664104 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.664215 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.696826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory" (OuterVolumeSpecName: "inventory") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.714645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e445ab6-1f18-49fe-b3f4-0921714e4d08" (UID: "7e445ab6-1f18-49fe-b3f4-0921714e4d08"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756251 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756293 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756332 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756345 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756358 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756369 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756408 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756426 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756438 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756449 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756484 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756503 4749 reconciler_common.go:293] "Volume detached 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756516 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e445ab6-1f18-49fe-b3f4-0921714e4d08-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:29 crc kubenswrapper[4749]: I0225 07:50:29.756527 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvrwh\" (UniqueName: \"kubernetes.io/projected/7e445ab6-1f18-49fe-b3f4-0921714e4d08-kube-api-access-cvrwh\") on node \"crc\" DevicePath \"\"" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.104697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" event={"ID":"7e445ab6-1f18-49fe-b3f4-0921714e4d08","Type":"ContainerDied","Data":"230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63"} Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.104765 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="230844c67c0cf3656ad28bdebb0d91a97e9291fab42ba717fc821300622c8b63" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.104849 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.281698 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h"] Feb 25 07:50:30 crc kubenswrapper[4749]: E0225 07:50:30.282408 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" containerName="oc" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.282446 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" containerName="oc" Feb 25 07:50:30 crc kubenswrapper[4749]: E0225 07:50:30.282492 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e445ab6-1f18-49fe-b3f4-0921714e4d08" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.282514 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e445ab6-1f18-49fe-b3f4-0921714e4d08" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.283031 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e445ab6-1f18-49fe-b3f4-0921714e4d08" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.283087 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" containerName="oc" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.284418 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.288095 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.288407 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.288929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.289964 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.290230 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.295579 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h"] Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.367179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.367260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jlv7\" (UniqueName: \"kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: 
\"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.367435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.367474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.367732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.469420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.470439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.470516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jlv7\" (UniqueName: \"kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.470563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.470587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.471727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.475512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.477176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.477346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.495091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jlv7\" (UniqueName: \"kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5nv2h\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:30 crc kubenswrapper[4749]: I0225 07:50:30.622055 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:50:31 crc kubenswrapper[4749]: I0225 07:50:31.266012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h"] Feb 25 07:50:32 crc kubenswrapper[4749]: I0225 07:50:32.124916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" event={"ID":"4671b264-81d8-4dfb-9bb9-33a1f2c46068","Type":"ContainerStarted","Data":"1a05b0a9ecf41a178d2197339b749823f93cfcac4406c6eb37dae822ad9848b0"} Feb 25 07:50:32 crc kubenswrapper[4749]: I0225 07:50:32.125577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" event={"ID":"4671b264-81d8-4dfb-9bb9-33a1f2c46068","Type":"ContainerStarted","Data":"04c6528bb137a77ce34d0ef9523d831c09952a44e51ac6b02dc971e8299412e0"} Feb 25 07:50:32 crc kubenswrapper[4749]: I0225 07:50:32.145530 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" podStartSLOduration=1.697423221 podStartE2EDuration="2.145510372s" podCreationTimestamp="2026-02-25 07:50:30 +0000 UTC" firstStartedPulling="2026-02-25 07:50:31.281230021 +0000 UTC m=+1984.643056081" lastFinishedPulling="2026-02-25 07:50:31.729317172 +0000 UTC m=+1985.091143232" observedRunningTime="2026-02-25 07:50:32.140294315 +0000 UTC m=+1985.502120346" watchObservedRunningTime="2026-02-25 07:50:32.145510372 +0000 UTC m=+1985.507336402" Feb 25 07:50:44 crc kubenswrapper[4749]: I0225 07:50:44.796274 4749 scope.go:117] "RemoveContainer" containerID="0516d6b693dd90cb486eb534267e8f17e563026993c67fb617028dd39bce440f" Feb 25 07:50:44 crc kubenswrapper[4749]: I0225 07:50:44.842139 4749 scope.go:117] "RemoveContainer" containerID="2f193ce21ec70e7769bb6c86b17aa99fc42a718f5c1d1010076f471ea5987ebd" Feb 25 07:51:32 crc kubenswrapper[4749]: I0225 07:51:32.761180 
4749 generic.go:334] "Generic (PLEG): container finished" podID="4671b264-81d8-4dfb-9bb9-33a1f2c46068" containerID="1a05b0a9ecf41a178d2197339b749823f93cfcac4406c6eb37dae822ad9848b0" exitCode=0 Feb 25 07:51:32 crc kubenswrapper[4749]: I0225 07:51:32.761715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" event={"ID":"4671b264-81d8-4dfb-9bb9-33a1f2c46068","Type":"ContainerDied","Data":"1a05b0a9ecf41a178d2197339b749823f93cfcac4406c6eb37dae822ad9848b0"} Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.188308 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.370022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam\") pod \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.370187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory\") pod \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.370287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle\") pod \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.370351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0\") pod \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.370414 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jlv7\" (UniqueName: \"kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7\") pod \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\" (UID: \"4671b264-81d8-4dfb-9bb9-33a1f2c46068\") " Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.378471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4671b264-81d8-4dfb-9bb9-33a1f2c46068" (UID: "4671b264-81d8-4dfb-9bb9-33a1f2c46068"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.379100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7" (OuterVolumeSpecName: "kube-api-access-5jlv7") pod "4671b264-81d8-4dfb-9bb9-33a1f2c46068" (UID: "4671b264-81d8-4dfb-9bb9-33a1f2c46068"). InnerVolumeSpecName "kube-api-access-5jlv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.405683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory" (OuterVolumeSpecName: "inventory") pod "4671b264-81d8-4dfb-9bb9-33a1f2c46068" (UID: "4671b264-81d8-4dfb-9bb9-33a1f2c46068"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.409537 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4671b264-81d8-4dfb-9bb9-33a1f2c46068" (UID: "4671b264-81d8-4dfb-9bb9-33a1f2c46068"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.412925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4671b264-81d8-4dfb-9bb9-33a1f2c46068" (UID: "4671b264-81d8-4dfb-9bb9-33a1f2c46068"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.473444 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jlv7\" (UniqueName: \"kubernetes.io/projected/4671b264-81d8-4dfb-9bb9-33a1f2c46068-kube-api-access-5jlv7\") on node \"crc\" DevicePath \"\"" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.474678 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.474698 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.474717 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.474730 4749 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4671b264-81d8-4dfb-9bb9-33a1f2c46068-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.779685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" event={"ID":"4671b264-81d8-4dfb-9bb9-33a1f2c46068","Type":"ContainerDied","Data":"04c6528bb137a77ce34d0ef9523d831c09952a44e51ac6b02dc971e8299412e0"} Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.779747 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c6528bb137a77ce34d0ef9523d831c09952a44e51ac6b02dc971e8299412e0" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.779818 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5nv2h" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.968655 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95"] Feb 25 07:51:34 crc kubenswrapper[4749]: E0225 07:51:34.969422 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4671b264-81d8-4dfb-9bb9-33a1f2c46068" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.969442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4671b264-81d8-4dfb-9bb9-33a1f2c46068" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.969666 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4671b264-81d8-4dfb-9bb9-33a1f2c46068" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.970344 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.972106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.973052 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.973264 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.973392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.974672 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.975104 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.982805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.982870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.982990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75pw9\" (UniqueName: \"kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.983034 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.983134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.983235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:34 crc kubenswrapper[4749]: I0225 07:51:34.995714 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95"] Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75pw9\" (UniqueName: \"kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.084463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.090326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.091093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.092303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.092618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.102572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.108414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75pw9\" (UniqueName: \"kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.291767 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:51:35 crc kubenswrapper[4749]: I0225 07:51:35.874135 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95"] Feb 25 07:51:36 crc kubenswrapper[4749]: I0225 07:51:36.798393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" event={"ID":"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f","Type":"ContainerStarted","Data":"103af9fad2e4826cd15216f2d8330ef12c2db26f8715ecc6d7d907e864f4c991"} Feb 25 07:51:36 crc kubenswrapper[4749]: I0225 07:51:36.799104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" event={"ID":"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f","Type":"ContainerStarted","Data":"0b303f56c951c8686618b3fba2f2c305e43d29a94bf0605f50b006febdbef71a"} Feb 25 07:51:36 crc kubenswrapper[4749]: I0225 07:51:36.816267 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" podStartSLOduration=2.321136408 podStartE2EDuration="2.81625139s" podCreationTimestamp="2026-02-25 07:51:34 +0000 UTC" firstStartedPulling="2026-02-25 07:51:35.878793431 +0000 UTC m=+2049.240619451" lastFinishedPulling="2026-02-25 07:51:36.373908373 +0000 UTC m=+2049.735734433" observedRunningTime="2026-02-25 07:51:36.812286734 +0000 UTC m=+2050.174112754" watchObservedRunningTime="2026-02-25 07:51:36.81625139 +0000 UTC m=+2050.178077410" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.144586 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533432-jt8z4"] 
Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.146700 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.150292 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.151254 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.156632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533432-jt8z4"] Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.156871 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.306749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnms6\" (UniqueName: \"kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6\") pod \"auto-csr-approver-29533432-jt8z4\" (UID: \"f2d09c41-2f15-4fde-92a4-2c6a759e7173\") " pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.409025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnms6\" (UniqueName: \"kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6\") pod \"auto-csr-approver-29533432-jt8z4\" (UID: \"f2d09c41-2f15-4fde-92a4-2c6a759e7173\") " pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.432194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnms6\" (UniqueName: 
\"kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6\") pod \"auto-csr-approver-29533432-jt8z4\" (UID: \"f2d09c41-2f15-4fde-92a4-2c6a759e7173\") " pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.465812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:00 crc kubenswrapper[4749]: I0225 07:52:00.934407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533432-jt8z4"] Feb 25 07:52:01 crc kubenswrapper[4749]: I0225 07:52:01.041186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" event={"ID":"f2d09c41-2f15-4fde-92a4-2c6a759e7173","Type":"ContainerStarted","Data":"79fff66083dcb5caf71946f4ca993dbf2b057afe3f24b58d7a8b6f3b8d806b52"} Feb 25 07:52:03 crc kubenswrapper[4749]: I0225 07:52:03.063640 4749 generic.go:334] "Generic (PLEG): container finished" podID="f2d09c41-2f15-4fde-92a4-2c6a759e7173" containerID="f50f1aad5fa998d067df9f1bfedd4f87bb53314d16c73f67ce09da4242d4e6d9" exitCode=0 Feb 25 07:52:03 crc kubenswrapper[4749]: I0225 07:52:03.063753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" event={"ID":"f2d09c41-2f15-4fde-92a4-2c6a759e7173","Type":"ContainerDied","Data":"f50f1aad5fa998d067df9f1bfedd4f87bb53314d16c73f67ce09da4242d4e6d9"} Feb 25 07:52:04 crc kubenswrapper[4749]: I0225 07:52:04.410445 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:04 crc kubenswrapper[4749]: I0225 07:52:04.587101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnms6\" (UniqueName: \"kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6\") pod \"f2d09c41-2f15-4fde-92a4-2c6a759e7173\" (UID: \"f2d09c41-2f15-4fde-92a4-2c6a759e7173\") " Feb 25 07:52:04 crc kubenswrapper[4749]: I0225 07:52:04.594714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6" (OuterVolumeSpecName: "kube-api-access-lnms6") pod "f2d09c41-2f15-4fde-92a4-2c6a759e7173" (UID: "f2d09c41-2f15-4fde-92a4-2c6a759e7173"). InnerVolumeSpecName "kube-api-access-lnms6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:52:04 crc kubenswrapper[4749]: I0225 07:52:04.690390 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnms6\" (UniqueName: \"kubernetes.io/projected/f2d09c41-2f15-4fde-92a4-2c6a759e7173-kube-api-access-lnms6\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:05 crc kubenswrapper[4749]: I0225 07:52:05.084882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" event={"ID":"f2d09c41-2f15-4fde-92a4-2c6a759e7173","Type":"ContainerDied","Data":"79fff66083dcb5caf71946f4ca993dbf2b057afe3f24b58d7a8b6f3b8d806b52"} Feb 25 07:52:05 crc kubenswrapper[4749]: I0225 07:52:05.085276 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79fff66083dcb5caf71946f4ca993dbf2b057afe3f24b58d7a8b6f3b8d806b52" Feb 25 07:52:05 crc kubenswrapper[4749]: I0225 07:52:05.085001 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533432-jt8z4" Feb 25 07:52:05 crc kubenswrapper[4749]: I0225 07:52:05.490550 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533426-5xgxf"] Feb 25 07:52:05 crc kubenswrapper[4749]: I0225 07:52:05.498317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533426-5xgxf"] Feb 25 07:52:07 crc kubenswrapper[4749]: I0225 07:52:07.336423 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a389a4a6-b001-424f-af2d-058cc85fc419" path="/var/lib/kubelet/pods/a389a4a6-b001-424f-af2d-058cc85fc419/volumes" Feb 25 07:52:21 crc kubenswrapper[4749]: I0225 07:52:21.672319 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:52:21 crc kubenswrapper[4749]: I0225 07:52:21.673294 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:52:24 crc kubenswrapper[4749]: I0225 07:52:24.287277 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" containerID="103af9fad2e4826cd15216f2d8330ef12c2db26f8715ecc6d7d907e864f4c991" exitCode=0 Feb 25 07:52:24 crc kubenswrapper[4749]: I0225 07:52:24.287384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" 
event={"ID":"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f","Type":"ContainerDied","Data":"103af9fad2e4826cd15216f2d8330ef12c2db26f8715ecc6d7d907e864f4c991"} Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.796790 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.986521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75pw9\" (UniqueName: \"kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.986595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.986736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.987545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.987644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.987700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle\") pod \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\" (UID: \"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f\") " Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.994863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:52:25 crc kubenswrapper[4749]: I0225 07:52:25.996082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9" (OuterVolumeSpecName: "kube-api-access-75pw9") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "kube-api-access-75pw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.034242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.049472 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.049947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory" (OuterVolumeSpecName: "inventory") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.052013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" (UID: "f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090349 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75pw9\" (UniqueName: \"kubernetes.io/projected/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-kube-api-access-75pw9\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090403 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090421 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090441 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090461 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.090478 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.315489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" event={"ID":"f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f","Type":"ContainerDied","Data":"0b303f56c951c8686618b3fba2f2c305e43d29a94bf0605f50b006febdbef71a"} Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.315988 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b303f56c951c8686618b3fba2f2c305e43d29a94bf0605f50b006febdbef71a" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.315667 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.454691 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw"] Feb 25 07:52:26 crc kubenswrapper[4749]: E0225 07:52:26.455346 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.455384 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 07:52:26 crc kubenswrapper[4749]: E0225 07:52:26.455436 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d09c41-2f15-4fde-92a4-2c6a759e7173" containerName="oc" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.455450 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d09c41-2f15-4fde-92a4-2c6a759e7173" containerName="oc" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.455843 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.455898 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d09c41-2f15-4fde-92a4-2c6a759e7173" containerName="oc" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.456962 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.463224 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.463531 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.463797 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.463794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.464057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.467969 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw"] Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.500963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.501058 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76dt\" (UniqueName: \"kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.501100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.501130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.501148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.603148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.603292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76dt\" (UniqueName: \"kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.603347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.603386 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.603413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.609736 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.611689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.612021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.616117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.622275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76dt\" (UniqueName: 
\"kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:26 crc kubenswrapper[4749]: I0225 07:52:26.796170 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:52:27 crc kubenswrapper[4749]: I0225 07:52:27.122982 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw"] Feb 25 07:52:27 crc kubenswrapper[4749]: I0225 07:52:27.340402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" event={"ID":"04b421fd-689e-4212-85a9-ffaecfe63fbe","Type":"ContainerStarted","Data":"75161fb1d698a4c7d700aa442d0e79f9033d50fa24576f256131e4bfd2b73242"} Feb 25 07:52:27 crc kubenswrapper[4749]: I0225 07:52:27.512895 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:52:28 crc kubenswrapper[4749]: I0225 07:52:28.337062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" event={"ID":"04b421fd-689e-4212-85a9-ffaecfe63fbe","Type":"ContainerStarted","Data":"f506f6319c968a756b90235936ff13981a96dea3827a92a26c55e8d8834556df"} Feb 25 07:52:28 crc kubenswrapper[4749]: I0225 07:52:28.359817 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" podStartSLOduration=1.973283959 podStartE2EDuration="2.359796356s" podCreationTimestamp="2026-02-25 07:52:26 +0000 UTC" firstStartedPulling="2026-02-25 07:52:27.121356867 +0000 UTC m=+2100.483182887" lastFinishedPulling="2026-02-25 07:52:27.507869224 +0000 UTC m=+2100.869695284" 
observedRunningTime="2026-02-25 07:52:28.35291446 +0000 UTC m=+2101.714740490" watchObservedRunningTime="2026-02-25 07:52:28.359796356 +0000 UTC m=+2101.721622386" Feb 25 07:52:35 crc kubenswrapper[4749]: I0225 07:52:35.981305 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:35 crc kubenswrapper[4749]: I0225 07:52:35.987814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.002171 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.105733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.105808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98kbk\" (UniqueName: \"kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.105901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 
07:52:36.207402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.207770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.207898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98kbk\" (UniqueName: \"kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.208040 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.208119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.241220 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98kbk\" (UniqueName: \"kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk\") pod \"certified-operators-gwfqs\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:36 crc kubenswrapper[4749]: I0225 07:52:36.321747 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:37 crc kubenswrapper[4749]: I0225 07:52:37.549999 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pk7lm container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 07:52:37 crc kubenswrapper[4749]: I0225 07:52:37.551736 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" podUID="e899a950-b7af-4fe2-b9db-856858e051fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:52:37 crc kubenswrapper[4749]: I0225 07:52:37.678623 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pk7lm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 25 07:52:37 crc kubenswrapper[4749]: I0225 07:52:37.678698 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pk7lm" podUID="e899a950-b7af-4fe2-b9db-856858e051fc" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 25 07:52:37 crc kubenswrapper[4749]: I0225 07:52:37.919276 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:38 crc kubenswrapper[4749]: I0225 07:52:38.448974 4749 generic.go:334] "Generic (PLEG): container finished" podID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerID="d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865" exitCode=0 Feb 25 07:52:38 crc kubenswrapper[4749]: I0225 07:52:38.449237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerDied","Data":"d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865"} Feb 25 07:52:38 crc kubenswrapper[4749]: I0225 07:52:38.449263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerStarted","Data":"a9d4f93ace04470ed9cb630dcf97c2c4efa63c65066cd00a10baa59a428bacc9"} Feb 25 07:52:38 crc kubenswrapper[4749]: I0225 07:52:38.451228 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:52:40 crc kubenswrapper[4749]: I0225 07:52:40.472425 4749 generic.go:334] "Generic (PLEG): container finished" podID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerID="e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40" exitCode=0 Feb 25 07:52:40 crc kubenswrapper[4749]: I0225 07:52:40.472533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerDied","Data":"e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40"} Feb 25 07:52:41 crc kubenswrapper[4749]: I0225 07:52:41.486415 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerStarted","Data":"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d"} Feb 25 07:52:44 crc kubenswrapper[4749]: I0225 07:52:44.961246 4749 scope.go:117] "RemoveContainer" containerID="5693fb493b6bc8d837acdc868c7dce21222add1de538a42d69aa0445c7ca46dc" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.322002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.322357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.380482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.414979 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwfqs" podStartSLOduration=8.967226063 podStartE2EDuration="11.414951309s" podCreationTimestamp="2026-02-25 07:52:35 +0000 UTC" firstStartedPulling="2026-02-25 07:52:38.451034255 +0000 UTC m=+2111.812860275" lastFinishedPulling="2026-02-25 07:52:40.898759451 +0000 UTC m=+2114.260585521" observedRunningTime="2026-02-25 07:52:41.523141026 +0000 UTC m=+2114.884967046" watchObservedRunningTime="2026-02-25 07:52:46.414951309 +0000 UTC m=+2119.776777349" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.588844 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:46 crc kubenswrapper[4749]: I0225 07:52:46.647115 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:48 crc kubenswrapper[4749]: I0225 07:52:48.555667 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwfqs" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="registry-server" containerID="cri-o://c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d" gracePeriod=2 Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.021651 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.194116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98kbk\" (UniqueName: \"kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk\") pod \"181a6c13-2bad-4ee6-91c7-6c012f950690\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.194294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content\") pod \"181a6c13-2bad-4ee6-91c7-6c012f950690\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.194364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities\") pod \"181a6c13-2bad-4ee6-91c7-6c012f950690\" (UID: \"181a6c13-2bad-4ee6-91c7-6c012f950690\") " Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.196650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities" (OuterVolumeSpecName: "utilities") pod "181a6c13-2bad-4ee6-91c7-6c012f950690" (UID: 
"181a6c13-2bad-4ee6-91c7-6c012f950690"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.203081 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk" (OuterVolumeSpecName: "kube-api-access-98kbk") pod "181a6c13-2bad-4ee6-91c7-6c012f950690" (UID: "181a6c13-2bad-4ee6-91c7-6c012f950690"). InnerVolumeSpecName "kube-api-access-98kbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.277532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "181a6c13-2bad-4ee6-91c7-6c012f950690" (UID: "181a6c13-2bad-4ee6-91c7-6c012f950690"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.296511 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98kbk\" (UniqueName: \"kubernetes.io/projected/181a6c13-2bad-4ee6-91c7-6c012f950690-kube-api-access-98kbk\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.296544 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.296554 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/181a6c13-2bad-4ee6-91c7-6c012f950690-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.571397 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerID="c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d" exitCode=0 Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.571464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerDied","Data":"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d"} Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.571502 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwfqs" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.571529 4749 scope.go:117] "RemoveContainer" containerID="c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.571512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwfqs" event={"ID":"181a6c13-2bad-4ee6-91c7-6c012f950690","Type":"ContainerDied","Data":"a9d4f93ace04470ed9cb630dcf97c2c4efa63c65066cd00a10baa59a428bacc9"} Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.604904 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.612263 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwfqs"] Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.617734 4749 scope.go:117] "RemoveContainer" containerID="e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.664455 4749 scope.go:117] "RemoveContainer" containerID="d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.687999 4749 scope.go:117] "RemoveContainer" 
containerID="c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d" Feb 25 07:52:49 crc kubenswrapper[4749]: E0225 07:52:49.688569 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d\": container with ID starting with c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d not found: ID does not exist" containerID="c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.688611 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d"} err="failed to get container status \"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d\": rpc error: code = NotFound desc = could not find container \"c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d\": container with ID starting with c659ee769c9726cb3c09a31412a909949c676fb654a69be2e9154833f294de9d not found: ID does not exist" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.688633 4749 scope.go:117] "RemoveContainer" containerID="e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40" Feb 25 07:52:49 crc kubenswrapper[4749]: E0225 07:52:49.689040 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40\": container with ID starting with e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40 not found: ID does not exist" containerID="e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.689095 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40"} err="failed to get container status \"e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40\": rpc error: code = NotFound desc = could not find container \"e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40\": container with ID starting with e87969fc63b01b2a68884d138cb17b5d1e13fc80f1d8b2b762eddfd249e11e40 not found: ID does not exist" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.689129 4749 scope.go:117] "RemoveContainer" containerID="d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865" Feb 25 07:52:49 crc kubenswrapper[4749]: E0225 07:52:49.689751 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865\": container with ID starting with d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865 not found: ID does not exist" containerID="d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865" Feb 25 07:52:49 crc kubenswrapper[4749]: I0225 07:52:49.689783 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865"} err="failed to get container status \"d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865\": rpc error: code = NotFound desc = could not find container \"d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865\": container with ID starting with d367a24de1bd34dd024b481964672430d4aecbed20618c305b627be0ff2de865 not found: ID does not exist" Feb 25 07:52:51 crc kubenswrapper[4749]: I0225 07:52:51.342036 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" path="/var/lib/kubelet/pods/181a6c13-2bad-4ee6-91c7-6c012f950690/volumes" Feb 25 07:52:51 crc kubenswrapper[4749]: I0225 
07:52:51.671436 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:52:51 crc kubenswrapper[4749]: I0225 07:52:51.672017 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.671569 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.672300 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.672369 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.673627 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.673703 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673" gracePeriod=600 Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.907290 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673" exitCode=0 Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.907332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673"} Feb 25 07:53:21 crc kubenswrapper[4749]: I0225 07:53:21.907572 4749 scope.go:117] "RemoveContainer" containerID="cf80e6ec431ff466bb18215a8c95465546d336aea4903ea866730f6e1f5acd28" Feb 25 07:53:22 crc kubenswrapper[4749]: I0225 07:53:22.931572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74"} Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.868537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:53:45 crc kubenswrapper[4749]: E0225 07:53:45.869328 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="extract-utilities" Feb 25 07:53:45 crc 
kubenswrapper[4749]: I0225 07:53:45.869340 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="extract-utilities" Feb 25 07:53:45 crc kubenswrapper[4749]: E0225 07:53:45.869362 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="registry-server" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.869368 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="registry-server" Feb 25 07:53:45 crc kubenswrapper[4749]: E0225 07:53:45.869378 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="extract-content" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.869384 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="extract-content" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.869554 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="181a6c13-2bad-4ee6-91c7-6c012f950690" containerName="registry-server" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.870802 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.887413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.999459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.999534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxz92\" (UniqueName: \"kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:45 crc kubenswrapper[4749]: I0225 07:53:45.999559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.101751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.101862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wxz92\" (UniqueName: \"kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.101896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.102496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.102812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.130682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxz92\" (UniqueName: \"kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92\") pod \"redhat-operators-ztztj\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.186637 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:46 crc kubenswrapper[4749]: I0225 07:53:46.653202 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:53:47 crc kubenswrapper[4749]: I0225 07:53:47.202141 4749 generic.go:334] "Generic (PLEG): container finished" podID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerID="a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc" exitCode=0 Feb 25 07:53:47 crc kubenswrapper[4749]: I0225 07:53:47.202234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerDied","Data":"a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc"} Feb 25 07:53:47 crc kubenswrapper[4749]: I0225 07:53:47.202463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerStarted","Data":"adf3a555db827f4772c4be848ca51c1553fc274aeea72463b1457b3a15b84606"} Feb 25 07:53:48 crc kubenswrapper[4749]: I0225 07:53:48.213675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerStarted","Data":"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc"} Feb 25 07:53:49 crc kubenswrapper[4749]: I0225 07:53:49.229937 4749 generic.go:334] "Generic (PLEG): container finished" podID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerID="5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc" exitCode=0 Feb 25 07:53:49 crc kubenswrapper[4749]: I0225 07:53:49.229993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" 
event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerDied","Data":"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc"} Feb 25 07:53:50 crc kubenswrapper[4749]: I0225 07:53:50.244880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerStarted","Data":"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9"} Feb 25 07:53:50 crc kubenswrapper[4749]: I0225 07:53:50.274791 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ztztj" podStartSLOduration=2.566033465 podStartE2EDuration="5.274774046s" podCreationTimestamp="2026-02-25 07:53:45 +0000 UTC" firstStartedPulling="2026-02-25 07:53:47.203985438 +0000 UTC m=+2180.565811458" lastFinishedPulling="2026-02-25 07:53:49.912725989 +0000 UTC m=+2183.274552039" observedRunningTime="2026-02-25 07:53:50.269752865 +0000 UTC m=+2183.631578895" watchObservedRunningTime="2026-02-25 07:53:50.274774046 +0000 UTC m=+2183.636600086" Feb 25 07:53:56 crc kubenswrapper[4749]: I0225 07:53:56.187624 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:56 crc kubenswrapper[4749]: I0225 07:53:56.188362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:53:57 crc kubenswrapper[4749]: I0225 07:53:57.242500 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ztztj" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="registry-server" probeResult="failure" output=< Feb 25 07:53:57 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 07:53:57 crc kubenswrapper[4749]: > Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.151827 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29533434-qhzxx"] Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.154099 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.156341 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.156727 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.157059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.183366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533434-qhzxx"] Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.316187 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnb4\" (UniqueName: \"kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4\") pod \"auto-csr-approver-29533434-qhzxx\" (UID: \"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f\") " pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.418214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnb4\" (UniqueName: \"kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4\") pod \"auto-csr-approver-29533434-qhzxx\" (UID: \"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f\") " pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.442727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnb4\" (UniqueName: 
\"kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4\") pod \"auto-csr-approver-29533434-qhzxx\" (UID: \"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f\") " pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.491807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:00 crc kubenswrapper[4749]: I0225 07:54:00.982438 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533434-qhzxx"] Feb 25 07:54:00 crc kubenswrapper[4749]: W0225 07:54:00.999906 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d1f1f44_25ca_48dd_9cf3_8fb69f33739f.slice/crio-5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e WatchSource:0}: Error finding container 5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e: Status 404 returned error can't find the container with id 5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e Feb 25 07:54:01 crc kubenswrapper[4749]: I0225 07:54:01.354672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" event={"ID":"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f","Type":"ContainerStarted","Data":"5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e"} Feb 25 07:54:02 crc kubenswrapper[4749]: I0225 07:54:02.364397 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" containerID="cd7f0f03c419146b9f4ba5b1915c124c4c8f9c5a39f0ea20b07f2f4364ddbef3" exitCode=0 Feb 25 07:54:02 crc kubenswrapper[4749]: I0225 07:54:02.364476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" 
event={"ID":"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f","Type":"ContainerDied","Data":"cd7f0f03c419146b9f4ba5b1915c124c4c8f9c5a39f0ea20b07f2f4364ddbef3"} Feb 25 07:54:03 crc kubenswrapper[4749]: I0225 07:54:03.771679 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:03 crc kubenswrapper[4749]: I0225 07:54:03.894156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hnb4\" (UniqueName: \"kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4\") pod \"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f\" (UID: \"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f\") " Feb 25 07:54:03 crc kubenswrapper[4749]: I0225 07:54:03.900846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4" (OuterVolumeSpecName: "kube-api-access-7hnb4") pod "6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" (UID: "6d1f1f44-25ca-48dd-9cf3-8fb69f33739f"). InnerVolumeSpecName "kube-api-access-7hnb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:54:03 crc kubenswrapper[4749]: I0225 07:54:03.997258 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hnb4\" (UniqueName: \"kubernetes.io/projected/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f-kube-api-access-7hnb4\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:04 crc kubenswrapper[4749]: I0225 07:54:04.386294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" event={"ID":"6d1f1f44-25ca-48dd-9cf3-8fb69f33739f","Type":"ContainerDied","Data":"5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e"} Feb 25 07:54:04 crc kubenswrapper[4749]: I0225 07:54:04.386673 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f593365046562250bd8a0bb163eeed67f710bffc8b654b89229866f4f17b06e" Feb 25 07:54:04 crc kubenswrapper[4749]: I0225 07:54:04.386366 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533434-qhzxx" Feb 25 07:54:04 crc kubenswrapper[4749]: I0225 07:54:04.844501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533428-z7djp"] Feb 25 07:54:04 crc kubenswrapper[4749]: I0225 07:54:04.857998 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533428-z7djp"] Feb 25 07:54:05 crc kubenswrapper[4749]: I0225 07:54:05.341383 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118325bc-53b6-454b-b62d-3eebfe528076" path="/var/lib/kubelet/pods/118325bc-53b6-454b-b62d-3eebfe528076/volumes" Feb 25 07:54:06 crc kubenswrapper[4749]: I0225 07:54:06.277479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:54:06 crc kubenswrapper[4749]: I0225 07:54:06.346273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:54:06 crc kubenswrapper[4749]: I0225 07:54:06.528684 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.411142 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ztztj" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="registry-server" containerID="cri-o://94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9" gracePeriod=2 Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.887859 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.971189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxz92\" (UniqueName: \"kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92\") pod \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.971250 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities\") pod \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.971364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content\") pod \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\" (UID: \"da9bd239-9dc7-44fc-a30c-19b04044a5a7\") " Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.972476 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities" (OuterVolumeSpecName: "utilities") pod "da9bd239-9dc7-44fc-a30c-19b04044a5a7" (UID: "da9bd239-9dc7-44fc-a30c-19b04044a5a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:54:07 crc kubenswrapper[4749]: I0225 07:54:07.977347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92" (OuterVolumeSpecName: "kube-api-access-wxz92") pod "da9bd239-9dc7-44fc-a30c-19b04044a5a7" (UID: "da9bd239-9dc7-44fc-a30c-19b04044a5a7"). InnerVolumeSpecName "kube-api-access-wxz92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.073667 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxz92\" (UniqueName: \"kubernetes.io/projected/da9bd239-9dc7-44fc-a30c-19b04044a5a7-kube-api-access-wxz92\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.073711 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.097222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da9bd239-9dc7-44fc-a30c-19b04044a5a7" (UID: "da9bd239-9dc7-44fc-a30c-19b04044a5a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.178317 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da9bd239-9dc7-44fc-a30c-19b04044a5a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.427364 4749 generic.go:334] "Generic (PLEG): container finished" podID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerID="94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9" exitCode=0 Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.427417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerDied","Data":"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9"} Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.427451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ztztj" event={"ID":"da9bd239-9dc7-44fc-a30c-19b04044a5a7","Type":"ContainerDied","Data":"adf3a555db827f4772c4be848ca51c1553fc274aeea72463b1457b3a15b84606"} Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.427465 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ztztj" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.427474 4749 scope.go:117] "RemoveContainer" containerID="94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.448096 4749 scope.go:117] "RemoveContainer" containerID="5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.486443 4749 scope.go:117] "RemoveContainer" containerID="a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.490911 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.501816 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ztztj"] Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.534670 4749 scope.go:117] "RemoveContainer" containerID="94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9" Feb 25 07:54:08 crc kubenswrapper[4749]: E0225 07:54:08.535118 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9\": container with ID starting with 94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9 not found: ID does not exist" containerID="94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.535173 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9"} err="failed to get container status \"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9\": rpc error: code = NotFound desc = could not find container 
\"94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9\": container with ID starting with 94e395eab6b719b6928b5ad766908dfa4ac0ce363f0d79d5522ef9f28fd67bd9 not found: ID does not exist" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.535251 4749 scope.go:117] "RemoveContainer" containerID="5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc" Feb 25 07:54:08 crc kubenswrapper[4749]: E0225 07:54:08.535699 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc\": container with ID starting with 5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc not found: ID does not exist" containerID="5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.535742 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc"} err="failed to get container status \"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc\": rpc error: code = NotFound desc = could not find container \"5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc\": container with ID starting with 5383dc7ab4120ee3b81fbe5840ee6014df01186a0b247a502d7181d02e72f4dc not found: ID does not exist" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.535768 4749 scope.go:117] "RemoveContainer" containerID="a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc" Feb 25 07:54:08 crc kubenswrapper[4749]: E0225 07:54:08.539485 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc\": container with ID starting with a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc not found: ID does not exist" 
containerID="a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc" Feb 25 07:54:08 crc kubenswrapper[4749]: I0225 07:54:08.539581 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc"} err="failed to get container status \"a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc\": rpc error: code = NotFound desc = could not find container \"a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc\": container with ID starting with a16d9681a672c3c4bbb927a010aa5b57b6e22561e6392268f6a95fc3d72098dc not found: ID does not exist" Feb 25 07:54:09 crc kubenswrapper[4749]: I0225 07:54:09.344177 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" path="/var/lib/kubelet/pods/da9bd239-9dc7-44fc-a30c-19b04044a5a7/volumes" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.468266 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:32 crc kubenswrapper[4749]: E0225 07:54:32.469446 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="registry-server" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.469469 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="registry-server" Feb 25 07:54:32 crc kubenswrapper[4749]: E0225 07:54:32.469510 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" containerName="oc" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.469524 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" containerName="oc" Feb 25 07:54:32 crc kubenswrapper[4749]: E0225 07:54:32.469550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="extract-content" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.469562 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="extract-content" Feb 25 07:54:32 crc kubenswrapper[4749]: E0225 07:54:32.469632 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="extract-utilities" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.469646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="extract-utilities" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.469963 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" containerName="oc" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.470014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9bd239-9dc7-44fc-a30c-19b04044a5a7" containerName="registry-server" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.472239 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.488178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.634205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.634289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knksh\" (UniqueName: \"kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.634356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.735926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.736377 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-knksh\" (UniqueName: \"kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.736586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.736875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.737186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.761708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knksh\" (UniqueName: \"kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh\") pod \"redhat-marketplace-nkrft\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:32 crc kubenswrapper[4749]: I0225 07:54:32.841705 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:33 crc kubenswrapper[4749]: I0225 07:54:33.348740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:33 crc kubenswrapper[4749]: I0225 07:54:33.735979 4749 generic.go:334] "Generic (PLEG): container finished" podID="884beb79-2276-4749-83d8-a9814f2cbcce" containerID="7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68" exitCode=0 Feb 25 07:54:33 crc kubenswrapper[4749]: I0225 07:54:33.736037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerDied","Data":"7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68"} Feb 25 07:54:33 crc kubenswrapper[4749]: I0225 07:54:33.736070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerStarted","Data":"5ab93c88de0e415b73e6558c98c64e17338437aa45cf580e50d34c336ef0f626"} Feb 25 07:54:34 crc kubenswrapper[4749]: I0225 07:54:34.754130 4749 generic.go:334] "Generic (PLEG): container finished" podID="884beb79-2276-4749-83d8-a9814f2cbcce" containerID="7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797" exitCode=0 Feb 25 07:54:34 crc kubenswrapper[4749]: I0225 07:54:34.754230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerDied","Data":"7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797"} Feb 25 07:54:36 crc kubenswrapper[4749]: I0225 07:54:36.779567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" 
event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerStarted","Data":"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93"} Feb 25 07:54:36 crc kubenswrapper[4749]: I0225 07:54:36.808038 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nkrft" podStartSLOduration=2.5911510250000003 podStartE2EDuration="4.808017663s" podCreationTimestamp="2026-02-25 07:54:32 +0000 UTC" firstStartedPulling="2026-02-25 07:54:33.738081174 +0000 UTC m=+2227.099907194" lastFinishedPulling="2026-02-25 07:54:35.954947782 +0000 UTC m=+2229.316773832" observedRunningTime="2026-02-25 07:54:36.79880973 +0000 UTC m=+2230.160635750" watchObservedRunningTime="2026-02-25 07:54:36.808017663 +0000 UTC m=+2230.169843683" Feb 25 07:54:42 crc kubenswrapper[4749]: I0225 07:54:42.842195 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:42 crc kubenswrapper[4749]: I0225 07:54:42.842742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:42 crc kubenswrapper[4749]: I0225 07:54:42.952075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:43 crc kubenswrapper[4749]: I0225 07:54:43.037524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:43 crc kubenswrapper[4749]: I0225 07:54:43.205073 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:44 crc kubenswrapper[4749]: I0225 07:54:44.883397 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nkrft" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="registry-server" 
containerID="cri-o://d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93" gracePeriod=2 Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.085085 4749 scope.go:117] "RemoveContainer" containerID="363e28066c93ed9fc9ef31a14f41d969455d80f95e4a678127dea21420af0cc8" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.459889 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.617244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knksh\" (UniqueName: \"kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh\") pod \"884beb79-2276-4749-83d8-a9814f2cbcce\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.617351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content\") pod \"884beb79-2276-4749-83d8-a9814f2cbcce\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.617618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities\") pod \"884beb79-2276-4749-83d8-a9814f2cbcce\" (UID: \"884beb79-2276-4749-83d8-a9814f2cbcce\") " Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.618251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities" (OuterVolumeSpecName: "utilities") pod "884beb79-2276-4749-83d8-a9814f2cbcce" (UID: "884beb79-2276-4749-83d8-a9814f2cbcce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.626171 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh" (OuterVolumeSpecName: "kube-api-access-knksh") pod "884beb79-2276-4749-83d8-a9814f2cbcce" (UID: "884beb79-2276-4749-83d8-a9814f2cbcce"). InnerVolumeSpecName "kube-api-access-knksh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.638649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "884beb79-2276-4749-83d8-a9814f2cbcce" (UID: "884beb79-2276-4749-83d8-a9814f2cbcce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.721423 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.721931 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knksh\" (UniqueName: \"kubernetes.io/projected/884beb79-2276-4749-83d8-a9814f2cbcce-kube-api-access-knksh\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.721963 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884beb79-2276-4749-83d8-a9814f2cbcce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.897508 4749 generic.go:334] "Generic (PLEG): container finished" podID="884beb79-2276-4749-83d8-a9814f2cbcce" 
containerID="d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93" exitCode=0 Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.897551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerDied","Data":"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93"} Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.897578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nkrft" event={"ID":"884beb79-2276-4749-83d8-a9814f2cbcce","Type":"ContainerDied","Data":"5ab93c88de0e415b73e6558c98c64e17338437aa45cf580e50d34c336ef0f626"} Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.897620 4749 scope.go:117] "RemoveContainer" containerID="d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.897617 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nkrft" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.927196 4749 scope.go:117] "RemoveContainer" containerID="7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.934078 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.944213 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nkrft"] Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.959001 4749 scope.go:117] "RemoveContainer" containerID="7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.981896 4749 scope.go:117] "RemoveContainer" containerID="d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93" Feb 25 07:54:45 crc kubenswrapper[4749]: E0225 07:54:45.982528 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93\": container with ID starting with d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93 not found: ID does not exist" containerID="d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.982627 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93"} err="failed to get container status \"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93\": rpc error: code = NotFound desc = could not find container \"d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93\": container with ID starting with d3fd1e37d4819a4bff49c6aecebfbd8f125979ac276740666bb79afc97297f93 not found: 
ID does not exist" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.982669 4749 scope.go:117] "RemoveContainer" containerID="7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797" Feb 25 07:54:45 crc kubenswrapper[4749]: E0225 07:54:45.983194 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797\": container with ID starting with 7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797 not found: ID does not exist" containerID="7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.983234 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797"} err="failed to get container status \"7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797\": rpc error: code = NotFound desc = could not find container \"7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797\": container with ID starting with 7ce6e423bf9ee5a3269baf0a1ac91ce400973c92994966fd81c8e2b00f535797 not found: ID does not exist" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.983263 4749 scope.go:117] "RemoveContainer" containerID="7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68" Feb 25 07:54:45 crc kubenswrapper[4749]: E0225 07:54:45.983761 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68\": container with ID starting with 7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68 not found: ID does not exist" containerID="7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68" Feb 25 07:54:45 crc kubenswrapper[4749]: I0225 07:54:45.983803 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68"} err="failed to get container status \"7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68\": rpc error: code = NotFound desc = could not find container \"7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68\": container with ID starting with 7d385178a98978a57155eb6f3ceab03ca745eb32f3de4d7719c198f349212c68 not found: ID does not exist" Feb 25 07:54:47 crc kubenswrapper[4749]: I0225 07:54:47.342794 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" path="/var/lib/kubelet/pods/884beb79-2276-4749-83d8-a9814f2cbcce/volumes" Feb 25 07:55:21 crc kubenswrapper[4749]: I0225 07:55:21.671889 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:55:21 crc kubenswrapper[4749]: I0225 07:55:21.672520 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:55:51 crc kubenswrapper[4749]: I0225 07:55:51.671400 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:55:51 crc kubenswrapper[4749]: I0225 07:55:51.674682 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.163330 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533436-gdfsc"] Feb 25 07:56:00 crc kubenswrapper[4749]: E0225 07:56:00.164269 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="extract-utilities" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.164283 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="extract-utilities" Feb 25 07:56:00 crc kubenswrapper[4749]: E0225 07:56:00.164305 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="extract-content" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.164312 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="extract-content" Feb 25 07:56:00 crc kubenswrapper[4749]: E0225 07:56:00.164335 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="registry-server" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.164343 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="registry-server" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.164582 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="884beb79-2276-4749-83d8-a9814f2cbcce" containerName="registry-server" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.165287 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.169749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.170276 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.176048 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.195623 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533436-gdfsc"] Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.301663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx74r\" (UniqueName: \"kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r\") pod \"auto-csr-approver-29533436-gdfsc\" (UID: \"078b859c-384b-4651-857d-f6337e808e8c\") " pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.403962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx74r\" (UniqueName: \"kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r\") pod \"auto-csr-approver-29533436-gdfsc\" (UID: \"078b859c-384b-4651-857d-f6337e808e8c\") " pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.439305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx74r\" (UniqueName: \"kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r\") pod \"auto-csr-approver-29533436-gdfsc\" (UID: \"078b859c-384b-4651-857d-f6337e808e8c\") " 
pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:00 crc kubenswrapper[4749]: I0225 07:56:00.489156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:01 crc kubenswrapper[4749]: I0225 07:56:01.017220 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533436-gdfsc"] Feb 25 07:56:01 crc kubenswrapper[4749]: I0225 07:56:01.783626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" event={"ID":"078b859c-384b-4651-857d-f6337e808e8c","Type":"ContainerStarted","Data":"3659ce74aac3fab87eec4613743d085aea25cebd60f05d3a098f98a97acad48e"} Feb 25 07:56:02 crc kubenswrapper[4749]: I0225 07:56:02.795852 4749 generic.go:334] "Generic (PLEG): container finished" podID="078b859c-384b-4651-857d-f6337e808e8c" containerID="a2fe4adc72ac95efe71e1f00fd9518b18143acf1d1828ef38f9155cf17242757" exitCode=0 Feb 25 07:56:02 crc kubenswrapper[4749]: I0225 07:56:02.795940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" event={"ID":"078b859c-384b-4651-857d-f6337e808e8c","Type":"ContainerDied","Data":"a2fe4adc72ac95efe71e1f00fd9518b18143acf1d1828ef38f9155cf17242757"} Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.180324 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.283825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx74r\" (UniqueName: \"kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r\") pod \"078b859c-384b-4651-857d-f6337e808e8c\" (UID: \"078b859c-384b-4651-857d-f6337e808e8c\") " Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.292845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r" (OuterVolumeSpecName: "kube-api-access-fx74r") pod "078b859c-384b-4651-857d-f6337e808e8c" (UID: "078b859c-384b-4651-857d-f6337e808e8c"). InnerVolumeSpecName "kube-api-access-fx74r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.386815 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx74r\" (UniqueName: \"kubernetes.io/projected/078b859c-384b-4651-857d-f6337e808e8c-kube-api-access-fx74r\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.820939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" event={"ID":"078b859c-384b-4651-857d-f6337e808e8c","Type":"ContainerDied","Data":"3659ce74aac3fab87eec4613743d085aea25cebd60f05d3a098f98a97acad48e"} Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.821013 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3659ce74aac3fab87eec4613743d085aea25cebd60f05d3a098f98a97acad48e" Feb 25 07:56:04 crc kubenswrapper[4749]: I0225 07:56:04.821084 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533436-gdfsc" Feb 25 07:56:05 crc kubenswrapper[4749]: I0225 07:56:05.266915 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533430-sxqcb"] Feb 25 07:56:05 crc kubenswrapper[4749]: I0225 07:56:05.275375 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533430-sxqcb"] Feb 25 07:56:05 crc kubenswrapper[4749]: I0225 07:56:05.338226 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3461bb05-8639-4ffd-bb8d-7e5fdb09ed72" path="/var/lib/kubelet/pods/3461bb05-8639-4ffd-bb8d-7e5fdb09ed72/volumes" Feb 25 07:56:16 crc kubenswrapper[4749]: I0225 07:56:16.959935 4749 generic.go:334] "Generic (PLEG): container finished" podID="04b421fd-689e-4212-85a9-ffaecfe63fbe" containerID="f506f6319c968a756b90235936ff13981a96dea3827a92a26c55e8d8834556df" exitCode=0 Feb 25 07:56:16 crc kubenswrapper[4749]: I0225 07:56:16.960792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" event={"ID":"04b421fd-689e-4212-85a9-ffaecfe63fbe","Type":"ContainerDied","Data":"f506f6319c968a756b90235936ff13981a96dea3827a92a26c55e8d8834556df"} Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.567623 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.603045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76dt\" (UniqueName: \"kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt\") pod \"04b421fd-689e-4212-85a9-ffaecfe63fbe\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.603189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle\") pod \"04b421fd-689e-4212-85a9-ffaecfe63fbe\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.603408 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory\") pod \"04b421fd-689e-4212-85a9-ffaecfe63fbe\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.603465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0\") pod \"04b421fd-689e-4212-85a9-ffaecfe63fbe\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.603537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam\") pod \"04b421fd-689e-4212-85a9-ffaecfe63fbe\" (UID: \"04b421fd-689e-4212-85a9-ffaecfe63fbe\") " Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.616498 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt" (OuterVolumeSpecName: "kube-api-access-t76dt") pod "04b421fd-689e-4212-85a9-ffaecfe63fbe" (UID: "04b421fd-689e-4212-85a9-ffaecfe63fbe"). InnerVolumeSpecName "kube-api-access-t76dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.631954 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "04b421fd-689e-4212-85a9-ffaecfe63fbe" (UID: "04b421fd-689e-4212-85a9-ffaecfe63fbe"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.643581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04b421fd-689e-4212-85a9-ffaecfe63fbe" (UID: "04b421fd-689e-4212-85a9-ffaecfe63fbe"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.659335 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory" (OuterVolumeSpecName: "inventory") pod "04b421fd-689e-4212-85a9-ffaecfe63fbe" (UID: "04b421fd-689e-4212-85a9-ffaecfe63fbe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.661181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "04b421fd-689e-4212-85a9-ffaecfe63fbe" (UID: "04b421fd-689e-4212-85a9-ffaecfe63fbe"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.705816 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.705849 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.705861 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.705870 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76dt\" (UniqueName: \"kubernetes.io/projected/04b421fd-689e-4212-85a9-ffaecfe63fbe-kube-api-access-t76dt\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.705878 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b421fd-689e-4212-85a9-ffaecfe63fbe-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.989976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" event={"ID":"04b421fd-689e-4212-85a9-ffaecfe63fbe","Type":"ContainerDied","Data":"75161fb1d698a4c7d700aa442d0e79f9033d50fa24576f256131e4bfd2b73242"} Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.990021 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75161fb1d698a4c7d700aa442d0e79f9033d50fa24576f256131e4bfd2b73242" Feb 25 07:56:18 crc kubenswrapper[4749]: I0225 07:56:18.990062 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.167438 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2"] Feb 25 07:56:19 crc kubenswrapper[4749]: E0225 07:56:19.168411 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="078b859c-384b-4651-857d-f6337e808e8c" containerName="oc" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.168441 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="078b859c-384b-4651-857d-f6337e808e8c" containerName="oc" Feb 25 07:56:19 crc kubenswrapper[4749]: E0225 07:56:19.168513 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b421fd-689e-4212-85a9-ffaecfe63fbe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.168531 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b421fd-689e-4212-85a9-ffaecfe63fbe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.168904 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b421fd-689e-4212-85a9-ffaecfe63fbe" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.168937 4749 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="078b859c-384b-4651-857d-f6337e808e8c" containerName="oc" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.169977 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.174022 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.174337 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.174896 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.175140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.175373 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.175633 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.176001 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.189768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2"] Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.216812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.216928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217004 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjsl\" (UniqueName: \"kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217480 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.217532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.318976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjsl\" (UniqueName: \"kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319573 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.319883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.321569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.326482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.327231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.328482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.328621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.329375 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.330770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.330902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.331843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.337075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: 
\"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.341401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjsl\" (UniqueName: \"kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l2dl2\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.495455 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:56:19 crc kubenswrapper[4749]: I0225 07:56:19.907121 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2"] Feb 25 07:56:20 crc kubenswrapper[4749]: I0225 07:56:20.002300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" event={"ID":"9b501ea6-b5f9-497b-9da6-072e7a0fde7a","Type":"ContainerStarted","Data":"5d67954dc648dab058f488b3b0094d7e3caa5b7978385373be7b84dc9b9b31dd"} Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.014876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" event={"ID":"9b501ea6-b5f9-497b-9da6-072e7a0fde7a","Type":"ContainerStarted","Data":"8ead41a6affafbef41860a1f090112fd6783965dfeb01722ebce19361a390853"} Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.047132 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" podStartSLOduration=1.475335091 podStartE2EDuration="2.047082086s" podCreationTimestamp="2026-02-25 07:56:19 +0000 UTC" firstStartedPulling="2026-02-25 07:56:19.921643214 +0000 UTC m=+2333.283469234" 
lastFinishedPulling="2026-02-25 07:56:20.493390169 +0000 UTC m=+2333.855216229" observedRunningTime="2026-02-25 07:56:21.043803396 +0000 UTC m=+2334.405629436" watchObservedRunningTime="2026-02-25 07:56:21.047082086 +0000 UTC m=+2334.408908136" Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.671691 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.672094 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.672158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.673344 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 07:56:21 crc kubenswrapper[4749]: I0225 07:56:21.673450 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" 
containerID="cri-o://fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" gracePeriod=600 Feb 25 07:56:21 crc kubenswrapper[4749]: E0225 07:56:21.802737 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:56:22 crc kubenswrapper[4749]: I0225 07:56:22.031191 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" exitCode=0 Feb 25 07:56:22 crc kubenswrapper[4749]: I0225 07:56:22.032712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74"} Feb 25 07:56:22 crc kubenswrapper[4749]: I0225 07:56:22.032773 4749 scope.go:117] "RemoveContainer" containerID="98443602e7947cb4a6567fd282fa48802b593e5fcba97afe2209236574d08673" Feb 25 07:56:22 crc kubenswrapper[4749]: I0225 07:56:22.033300 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:56:22 crc kubenswrapper[4749]: E0225 07:56:22.033719 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:56:36 crc kubenswrapper[4749]: I0225 07:56:36.323879 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:56:36 crc kubenswrapper[4749]: E0225 07:56:36.324980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:56:45 crc kubenswrapper[4749]: I0225 07:56:45.226034 4749 scope.go:117] "RemoveContainer" containerID="719070ce472a459a1ad95b1b86000193936c1d21bb1b30fb00e97c567d49ff0c" Feb 25 07:56:47 crc kubenswrapper[4749]: I0225 07:56:47.336371 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:56:47 crc kubenswrapper[4749]: E0225 07:56:47.337262 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:57:02 crc kubenswrapper[4749]: I0225 07:57:02.322675 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:57:02 crc kubenswrapper[4749]: E0225 07:57:02.323545 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:57:15 crc kubenswrapper[4749]: I0225 07:57:15.322415 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:57:15 crc kubenswrapper[4749]: E0225 07:57:15.323339 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:57:30 crc kubenswrapper[4749]: I0225 07:57:30.324036 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:57:30 crc kubenswrapper[4749]: E0225 07:57:30.325087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:57:41 crc kubenswrapper[4749]: I0225 07:57:41.322563 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:57:41 crc kubenswrapper[4749]: E0225 07:57:41.323435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:57:52 crc kubenswrapper[4749]: I0225 07:57:52.323148 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:57:52 crc kubenswrapper[4749]: E0225 07:57:52.324181 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.155053 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533438-d6qkv"] Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.156793 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.159823 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.159870 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.160051 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.172511 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533438-d6qkv"] Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.343623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt5xz\" (UniqueName: \"kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz\") pod \"auto-csr-approver-29533438-d6qkv\" (UID: \"181c0d3e-7868-480b-b159-f0315cb19dd4\") " pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.445452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt5xz\" (UniqueName: \"kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz\") pod \"auto-csr-approver-29533438-d6qkv\" (UID: \"181c0d3e-7868-480b-b159-f0315cb19dd4\") " pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.464827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt5xz\" (UniqueName: \"kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz\") pod \"auto-csr-approver-29533438-d6qkv\" (UID: \"181c0d3e-7868-480b-b159-f0315cb19dd4\") " 
pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.496386 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.965474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533438-d6qkv"] Feb 25 07:58:00 crc kubenswrapper[4749]: W0225 07:58:00.983053 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod181c0d3e_7868_480b_b159_f0315cb19dd4.slice/crio-db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c WatchSource:0}: Error finding container db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c: Status 404 returned error can't find the container with id db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c Feb 25 07:58:00 crc kubenswrapper[4749]: I0225 07:58:00.987523 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 07:58:01 crc kubenswrapper[4749]: I0225 07:58:01.642945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" event={"ID":"181c0d3e-7868-480b-b159-f0315cb19dd4","Type":"ContainerStarted","Data":"db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c"} Feb 25 07:58:02 crc kubenswrapper[4749]: I0225 07:58:02.656670 4749 generic.go:334] "Generic (PLEG): container finished" podID="181c0d3e-7868-480b-b159-f0315cb19dd4" containerID="0d8c07bba520694800d7dac8c043f1e44e5e7eb54dba35dd6ce7e49463724141" exitCode=0 Feb 25 07:58:02 crc kubenswrapper[4749]: I0225 07:58:02.656793 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" 
event={"ID":"181c0d3e-7868-480b-b159-f0315cb19dd4","Type":"ContainerDied","Data":"0d8c07bba520694800d7dac8c043f1e44e5e7eb54dba35dd6ce7e49463724141"} Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.085779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.231189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt5xz\" (UniqueName: \"kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz\") pod \"181c0d3e-7868-480b-b159-f0315cb19dd4\" (UID: \"181c0d3e-7868-480b-b159-f0315cb19dd4\") " Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.247238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz" (OuterVolumeSpecName: "kube-api-access-pt5xz") pod "181c0d3e-7868-480b-b159-f0315cb19dd4" (UID: "181c0d3e-7868-480b-b159-f0315cb19dd4"). InnerVolumeSpecName "kube-api-access-pt5xz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.333840 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt5xz\" (UniqueName: \"kubernetes.io/projected/181c0d3e-7868-480b-b159-f0315cb19dd4-kube-api-access-pt5xz\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.681807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" event={"ID":"181c0d3e-7868-480b-b159-f0315cb19dd4","Type":"ContainerDied","Data":"db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c"} Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.681874 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5b6e5bce7a45f6f866df219ab771914dbb6da38188a29ceb0aecc75186f35c" Feb 25 07:58:04 crc kubenswrapper[4749]: I0225 07:58:04.681902 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533438-d6qkv" Feb 25 07:58:05 crc kubenswrapper[4749]: I0225 07:58:05.174287 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533432-jt8z4"] Feb 25 07:58:05 crc kubenswrapper[4749]: I0225 07:58:05.184653 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533432-jt8z4"] Feb 25 07:58:05 crc kubenswrapper[4749]: I0225 07:58:05.337161 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d09c41-2f15-4fde-92a4-2c6a759e7173" path="/var/lib/kubelet/pods/f2d09c41-2f15-4fde-92a4-2c6a759e7173/volumes" Feb 25 07:58:06 crc kubenswrapper[4749]: I0225 07:58:06.323334 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:58:06 crc kubenswrapper[4749]: E0225 07:58:06.324054 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:58:18 crc kubenswrapper[4749]: I0225 07:58:18.323530 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:58:18 crc kubenswrapper[4749]: E0225 07:58:18.326056 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:58:33 crc kubenswrapper[4749]: I0225 07:58:33.322691 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:58:33 crc kubenswrapper[4749]: E0225 07:58:33.324011 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:58:45 crc kubenswrapper[4749]: I0225 07:58:45.352651 4749 scope.go:117] "RemoveContainer" containerID="f50f1aad5fa998d067df9f1bfedd4f87bb53314d16c73f67ce09da4242d4e6d9" Feb 25 07:58:46 crc kubenswrapper[4749]: I0225 07:58:46.322536 4749 scope.go:117] "RemoveContainer" 
containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:58:46 crc kubenswrapper[4749]: E0225 07:58:46.323373 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.165762 4749 generic.go:334] "Generic (PLEG): container finished" podID="9b501ea6-b5f9-497b-9da6-072e7a0fde7a" containerID="8ead41a6affafbef41860a1f090112fd6783965dfeb01722ebce19361a390853" exitCode=0 Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.165889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" event={"ID":"9b501ea6-b5f9-497b-9da6-072e7a0fde7a","Type":"ContainerDied","Data":"8ead41a6affafbef41860a1f090112fd6783965dfeb01722ebce19361a390853"} Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.675112 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-67nt9"] Feb 25 07:58:49 crc kubenswrapper[4749]: E0225 07:58:49.675767 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181c0d3e-7868-480b-b159-f0315cb19dd4" containerName="oc" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.675799 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="181c0d3e-7868-480b-b159-f0315cb19dd4" containerName="oc" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.676224 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="181c0d3e-7868-480b-b159-f0315cb19dd4" containerName="oc" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.678521 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.700935 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67nt9"] Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.787193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-catalog-content\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.787568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5rql\" (UniqueName: \"kubernetes.io/projected/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-kube-api-access-m5rql\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.787663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-utilities\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.890111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5rql\" (UniqueName: \"kubernetes.io/projected/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-kube-api-access-m5rql\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.890200 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-utilities\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.890297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-catalog-content\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.891246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-utilities\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.891716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-catalog-content\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:49 crc kubenswrapper[4749]: I0225 07:58:49.920497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5rql\" (UniqueName: \"kubernetes.io/projected/9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc-kube-api-access-m5rql\") pod \"community-operators-67nt9\" (UID: \"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc\") " pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:50 crc kubenswrapper[4749]: I0225 07:58:50.003561 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.060042 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67nt9"] Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.205280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67nt9" event={"ID":"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc","Type":"ContainerStarted","Data":"2fcc301c0c5e7acebc593a5f9bc14b846c6d9d8e3ead96824810d215a520ef4e"} Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.207807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" event={"ID":"9b501ea6-b5f9-497b-9da6-072e7a0fde7a","Type":"ContainerDied","Data":"5d67954dc648dab058f488b3b0094d7e3caa5b7978385373be7b84dc9b9b31dd"} Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.208099 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d67954dc648dab058f488b3b0094d7e3caa5b7978385373be7b84dc9b9b31dd" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.254255 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418142 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmjsl\" (UniqueName: \"kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418782 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.418993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1\") pod \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\" (UID: \"9b501ea6-b5f9-497b-9da6-072e7a0fde7a\") " Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.425088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl" (OuterVolumeSpecName: "kube-api-access-dmjsl") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "kube-api-access-dmjsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.430249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.460990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.465132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). 
InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.472367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.477332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.489109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.489778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory" (OuterVolumeSpecName: "inventory") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.490384 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.495677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.500890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9b501ea6-b5f9-497b-9da6-072e7a0fde7a" (UID: "9b501ea6-b5f9-497b-9da6-072e7a0fde7a"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521300 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521367 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521387 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521405 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmjsl\" (UniqueName: \"kubernetes.io/projected/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-kube-api-access-dmjsl\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521423 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521441 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521459 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-2\") 
on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521476 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521494 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521514 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:51 crc kubenswrapper[4749]: I0225 07:58:51.521531 4749 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9b501ea6-b5f9-497b-9da6-072e7a0fde7a-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.224813 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc" containerID="c31f10a250ac941940c1271d69ead14a3a4faa0a2ee22375180e49e3ac20560a" exitCode=0 Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.225114 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l2dl2" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.225222 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67nt9" event={"ID":"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc","Type":"ContainerDied","Data":"c31f10a250ac941940c1271d69ead14a3a4faa0a2ee22375180e49e3ac20560a"} Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.394068 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq"] Feb 25 07:58:52 crc kubenswrapper[4749]: E0225 07:58:52.394496 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b501ea6-b5f9-497b-9da6-072e7a0fde7a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.394521 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b501ea6-b5f9-497b-9da6-072e7a0fde7a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.394864 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b501ea6-b5f9-497b-9da6-072e7a0fde7a" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.396065 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.398531 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.398535 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.399031 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-m5tnb" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.400012 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.400517 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.417088 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq"] Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.542793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvq9\" (UniqueName: \"kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.542853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: 
\"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.542978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.543036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.543203 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.543357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.543552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.645701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.645836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvq9\" (UniqueName: \"kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.645875 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.645919 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.645955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.646069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.646166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.652636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.654115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.654327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.654953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.656587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.659046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.674506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvq9\" (UniqueName: \"kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:52 crc kubenswrapper[4749]: I0225 07:58:52.734547 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 07:58:53 crc kubenswrapper[4749]: I0225 07:58:53.303489 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq"] Feb 25 07:58:53 crc kubenswrapper[4749]: W0225 07:58:53.309553 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d17a7c_1c9d_47bc_818d_c2f567dfe075.slice/crio-63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb WatchSource:0}: Error finding container 63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb: Status 404 returned error can't find the container with id 63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb Feb 25 07:58:54 crc kubenswrapper[4749]: I0225 07:58:54.250926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" event={"ID":"18d17a7c-1c9d-47bc-818d-c2f567dfe075","Type":"ContainerStarted","Data":"3527de6d4f863b0fa9040bd40f1d4f334df397b30f338fd70dea28c025fe5e95"} Feb 25 07:58:54 crc kubenswrapper[4749]: I0225 07:58:54.251341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" event={"ID":"18d17a7c-1c9d-47bc-818d-c2f567dfe075","Type":"ContainerStarted","Data":"63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb"} Feb 25 07:58:54 crc kubenswrapper[4749]: I0225 07:58:54.275860 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" podStartSLOduration=1.826109076 podStartE2EDuration="2.27584073s" podCreationTimestamp="2026-02-25 07:58:52 +0000 UTC" firstStartedPulling="2026-02-25 07:58:53.312380109 +0000 UTC m=+2486.674206129" lastFinishedPulling="2026-02-25 07:58:53.762111763 +0000 UTC m=+2487.123937783" 
observedRunningTime="2026-02-25 07:58:54.273471082 +0000 UTC m=+2487.635297102" watchObservedRunningTime="2026-02-25 07:58:54.27584073 +0000 UTC m=+2487.637666750" Feb 25 07:58:56 crc kubenswrapper[4749]: I0225 07:58:56.277163 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc" containerID="987045f321a238673215ec8a20fd96501b756c779dd9d044e0eb410f7610b820" exitCode=0 Feb 25 07:58:56 crc kubenswrapper[4749]: I0225 07:58:56.277255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67nt9" event={"ID":"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc","Type":"ContainerDied","Data":"987045f321a238673215ec8a20fd96501b756c779dd9d044e0eb410f7610b820"} Feb 25 07:58:57 crc kubenswrapper[4749]: I0225 07:58:57.290285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67nt9" event={"ID":"9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc","Type":"ContainerStarted","Data":"f6a7f736b3d43ba756571be6f20f2bb5185e1070130c78876ac7bc1ea110708f"} Feb 25 07:58:57 crc kubenswrapper[4749]: I0225 07:58:57.319901 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-67nt9" podStartSLOduration=3.881238435 podStartE2EDuration="8.319874834s" podCreationTimestamp="2026-02-25 07:58:49 +0000 UTC" firstStartedPulling="2026-02-25 07:58:52.236878837 +0000 UTC m=+2485.598704857" lastFinishedPulling="2026-02-25 07:58:56.675515206 +0000 UTC m=+2490.037341256" observedRunningTime="2026-02-25 07:58:57.308355514 +0000 UTC m=+2490.670181544" watchObservedRunningTime="2026-02-25 07:58:57.319874834 +0000 UTC m=+2490.681700884" Feb 25 07:59:00 crc kubenswrapper[4749]: I0225 07:59:00.005399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:59:00 crc kubenswrapper[4749]: I0225 07:59:00.006105 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:59:00 crc kubenswrapper[4749]: I0225 07:59:00.090214 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:59:00 crc kubenswrapper[4749]: I0225 07:59:00.322715 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:59:00 crc kubenswrapper[4749]: E0225 07:59:00.323134 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.092477 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-67nt9" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.194805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67nt9"] Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.251015 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.251297 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv24g" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="registry-server" containerID="cri-o://d9ef2253dccea103d56e10bf7cd7a8e4e780273fbcc4482cbdf2c737ac032c0d" gracePeriod=2 Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.434944 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerID="d9ef2253dccea103d56e10bf7cd7a8e4e780273fbcc4482cbdf2c737ac032c0d" exitCode=0 Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.435024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv24g" event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerDied","Data":"d9ef2253dccea103d56e10bf7cd7a8e4e780273fbcc4482cbdf2c737ac032c0d"} Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.780952 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.864017 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mdfd\" (UniqueName: \"kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd\") pod \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.864148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities\") pod \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.864256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content\") pod \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\" (UID: \"45225dff-f438-45e3-bccb-bf2c2d52f4e4\") " Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.864528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities" (OuterVolumeSpecName: "utilities") pod 
"45225dff-f438-45e3-bccb-bf2c2d52f4e4" (UID: "45225dff-f438-45e3-bccb-bf2c2d52f4e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.865010 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.880792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd" (OuterVolumeSpecName: "kube-api-access-6mdfd") pod "45225dff-f438-45e3-bccb-bf2c2d52f4e4" (UID: "45225dff-f438-45e3-bccb-bf2c2d52f4e4"). InnerVolumeSpecName "kube-api-access-6mdfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.923097 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45225dff-f438-45e3-bccb-bf2c2d52f4e4" (UID: "45225dff-f438-45e3-bccb-bf2c2d52f4e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.966524 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45225dff-f438-45e3-bccb-bf2c2d52f4e4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 07:59:10 crc kubenswrapper[4749]: I0225 07:59:10.966929 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mdfd\" (UniqueName: \"kubernetes.io/projected/45225dff-f438-45e3-bccb-bf2c2d52f4e4-kube-api-access-6mdfd\") on node \"crc\" DevicePath \"\"" Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.446614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv24g" event={"ID":"45225dff-f438-45e3-bccb-bf2c2d52f4e4","Type":"ContainerDied","Data":"70f63382c5f0899be190de8f3d6a4d82b69b3714aa716f69b6de06da5fba1364"} Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.446680 4749 scope.go:117] "RemoveContainer" containerID="d9ef2253dccea103d56e10bf7cd7a8e4e780273fbcc4482cbdf2c737ac032c0d" Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.446697 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv24g" Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.480274 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.480990 4749 scope.go:117] "RemoveContainer" containerID="256e589630cd3e1825cc54da3104b3bacd1c117b48e9cb758eb053722037d550" Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.491396 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv24g"] Feb 25 07:59:11 crc kubenswrapper[4749]: I0225 07:59:11.502887 4749 scope.go:117] "RemoveContainer" containerID="8f9240ea52ada703203bc660407dc20a5919a12d659ef87b276a1946a7d6ca3a" Feb 25 07:59:13 crc kubenswrapper[4749]: I0225 07:59:13.338751 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" path="/var/lib/kubelet/pods/45225dff-f438-45e3-bccb-bf2c2d52f4e4/volumes" Feb 25 07:59:14 crc kubenswrapper[4749]: I0225 07:59:14.322124 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:59:14 crc kubenswrapper[4749]: E0225 07:59:14.322711 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:59:28 crc kubenswrapper[4749]: I0225 07:59:28.322286 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:59:28 crc kubenswrapper[4749]: E0225 07:59:28.323399 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:59:39 crc kubenswrapper[4749]: I0225 07:59:39.323425 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:59:39 crc kubenswrapper[4749]: E0225 07:59:39.324531 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 07:59:53 crc kubenswrapper[4749]: I0225 07:59:53.323277 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 07:59:53 crc kubenswrapper[4749]: E0225 07:59:53.324048 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.157681 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533440-gbrqs"] Feb 25 08:00:00 crc kubenswrapper[4749]: E0225 08:00:00.159514 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="extract-utilities" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.159536 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="extract-utilities" Feb 25 08:00:00 crc kubenswrapper[4749]: E0225 08:00:00.159556 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="extract-content" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.159565 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="extract-content" Feb 25 08:00:00 crc kubenswrapper[4749]: E0225 08:00:00.159622 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="registry-server" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.159630 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="registry-server" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.159870 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="45225dff-f438-45e3-bccb-bf2c2d52f4e4" containerName="registry-server" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.160562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.163111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.164633 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.165145 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.175545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533440-gbrqs"] Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.203880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xvj\" (UniqueName: \"kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj\") pod \"auto-csr-approver-29533440-gbrqs\" (UID: \"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b\") " pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.257191 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc"] Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.258562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.260768 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.261363 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.266022 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc"] Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.306437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xvj\" (UniqueName: \"kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj\") pod \"auto-csr-approver-29533440-gbrqs\" (UID: \"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b\") " pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.306756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.306891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sx86\" (UniqueName: \"kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.306991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.336473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xvj\" (UniqueName: \"kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj\") pod \"auto-csr-approver-29533440-gbrqs\" (UID: \"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b\") " pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.408127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.408227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sx86\" (UniqueName: \"kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.408271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.409856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.412037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.431191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sx86\" (UniqueName: \"kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86\") pod \"collect-profiles-29533440-8xhwc\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.482107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.579766 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.954384 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533440-gbrqs"] Feb 25 08:00:00 crc kubenswrapper[4749]: I0225 08:00:00.997843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" event={"ID":"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b","Type":"ContainerStarted","Data":"4264f436e89bba7fffec33a64083f6723fa4330c3bd24949d9990b06edc8b950"} Feb 25 08:00:01 crc kubenswrapper[4749]: W0225 08:00:01.025729 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod231ef4ae_295d_4fb2_a2dc_f77d87e098f3.slice/crio-66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25 WatchSource:0}: Error finding container 66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25: Status 404 returned error can't find the container with id 66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25 Feb 25 08:00:01 crc kubenswrapper[4749]: I0225 08:00:01.029697 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc"] Feb 25 08:00:02 crc kubenswrapper[4749]: I0225 08:00:02.022776 4749 generic.go:334] "Generic (PLEG): container finished" podID="231ef4ae-295d-4fb2-a2dc-f77d87e098f3" containerID="b592f2edc4e0016276fcf187ac7b7a7aa269be945ea368d03bee9ccf4645d93c" exitCode=0 Feb 25 08:00:02 crc kubenswrapper[4749]: I0225 08:00:02.022885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" event={"ID":"231ef4ae-295d-4fb2-a2dc-f77d87e098f3","Type":"ContainerDied","Data":"b592f2edc4e0016276fcf187ac7b7a7aa269be945ea368d03bee9ccf4645d93c"} Feb 25 08:00:02 crc kubenswrapper[4749]: I0225 08:00:02.023028 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" event={"ID":"231ef4ae-295d-4fb2-a2dc-f77d87e098f3","Type":"ContainerStarted","Data":"66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25"} Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.378969 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.485886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume\") pod \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.485949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume\") pod \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.486281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sx86\" (UniqueName: \"kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86\") pod \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\" (UID: \"231ef4ae-295d-4fb2-a2dc-f77d87e098f3\") " Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.486562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "231ef4ae-295d-4fb2-a2dc-f77d87e098f3" (UID: "231ef4ae-295d-4fb2-a2dc-f77d87e098f3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.486770 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.492105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "231ef4ae-295d-4fb2-a2dc-f77d87e098f3" (UID: "231ef4ae-295d-4fb2-a2dc-f77d87e098f3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.492782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86" (OuterVolumeSpecName: "kube-api-access-7sx86") pod "231ef4ae-295d-4fb2-a2dc-f77d87e098f3" (UID: "231ef4ae-295d-4fb2-a2dc-f77d87e098f3"). InnerVolumeSpecName "kube-api-access-7sx86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.588798 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sx86\" (UniqueName: \"kubernetes.io/projected/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-kube-api-access-7sx86\") on node \"crc\" DevicePath \"\"" Feb 25 08:00:03 crc kubenswrapper[4749]: I0225 08:00:03.588827 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231ef4ae-295d-4fb2-a2dc-f77d87e098f3-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:00:04 crc kubenswrapper[4749]: I0225 08:00:04.057852 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" event={"ID":"231ef4ae-295d-4fb2-a2dc-f77d87e098f3","Type":"ContainerDied","Data":"66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25"} Feb 25 08:00:04 crc kubenswrapper[4749]: I0225 08:00:04.057898 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e6f60cac49631d2bbc74a35d3a4507c8a87040050cf4f220e5853c3edd1e25" Feb 25 08:00:04 crc kubenswrapper[4749]: I0225 08:00:04.057944 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533440-8xhwc" Feb 25 08:00:04 crc kubenswrapper[4749]: I0225 08:00:04.477182 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg"] Feb 25 08:00:04 crc kubenswrapper[4749]: I0225 08:00:04.486563 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533395-7vrbg"] Feb 25 08:00:05 crc kubenswrapper[4749]: I0225 08:00:05.066918 4749 generic.go:334] "Generic (PLEG): container finished" podID="001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" containerID="5c62684f9bd9fa546ef6e84b055dd05993c8816acbcc3720ec57f5fced79fa79" exitCode=0 Feb 25 08:00:05 crc kubenswrapper[4749]: I0225 08:00:05.066967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" event={"ID":"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b","Type":"ContainerDied","Data":"5c62684f9bd9fa546ef6e84b055dd05993c8816acbcc3720ec57f5fced79fa79"} Feb 25 08:00:05 crc kubenswrapper[4749]: I0225 08:00:05.337687 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec106c3-2cb1-4c08-8b5a-920797ec4142" path="/var/lib/kubelet/pods/bec106c3-2cb1-4c08-8b5a-920797ec4142/volumes" Feb 25 08:00:06 crc kubenswrapper[4749]: I0225 08:00:06.511557 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:06 crc kubenswrapper[4749]: I0225 08:00:06.651380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9xvj\" (UniqueName: \"kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj\") pod \"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b\" (UID: \"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b\") " Feb 25 08:00:06 crc kubenswrapper[4749]: I0225 08:00:06.657258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj" (OuterVolumeSpecName: "kube-api-access-t9xvj") pod "001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" (UID: "001307fc-d97a-42a9-91b4-1b0b9bb5ba4b"). InnerVolumeSpecName "kube-api-access-t9xvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:00:06 crc kubenswrapper[4749]: I0225 08:00:06.753633 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9xvj\" (UniqueName: \"kubernetes.io/projected/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b-kube-api-access-t9xvj\") on node \"crc\" DevicePath \"\"" Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.088003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" event={"ID":"001307fc-d97a-42a9-91b4-1b0b9bb5ba4b","Type":"ContainerDied","Data":"4264f436e89bba7fffec33a64083f6723fa4330c3bd24949d9990b06edc8b950"} Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.088054 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4264f436e89bba7fffec33a64083f6723fa4330c3bd24949d9990b06edc8b950" Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.088066 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533440-gbrqs" Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.329935 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:00:07 crc kubenswrapper[4749]: E0225 08:00:07.330265 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.585394 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533434-qhzxx"] Feb 25 08:00:07 crc kubenswrapper[4749]: I0225 08:00:07.598536 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533434-qhzxx"] Feb 25 08:00:09 crc kubenswrapper[4749]: I0225 08:00:09.336946 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1f1f44-25ca-48dd-9cf3-8fb69f33739f" path="/var/lib/kubelet/pods/6d1f1f44-25ca-48dd-9cf3-8fb69f33739f/volumes" Feb 25 08:00:19 crc kubenswrapper[4749]: I0225 08:00:19.323487 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:00:19 crc kubenswrapper[4749]: E0225 08:00:19.324567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:00:33 crc kubenswrapper[4749]: I0225 08:00:33.323443 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:00:33 crc kubenswrapper[4749]: E0225 08:00:33.325986 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:00:44 crc kubenswrapper[4749]: I0225 08:00:44.322765 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:00:44 crc kubenswrapper[4749]: E0225 08:00:44.323893 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:00:45 crc kubenswrapper[4749]: I0225 08:00:45.509498 4749 scope.go:117] "RemoveContainer" containerID="3e1f4ce278983d29157591a2b038e7aa63556219a96f8a79dba5e9130a1a0df4" Feb 25 08:00:45 crc kubenswrapper[4749]: I0225 08:00:45.555491 4749 scope.go:117] "RemoveContainer" containerID="cd7f0f03c419146b9f4ba5b1915c124c4c8f9c5a39f0ea20b07f2f4364ddbef3" Feb 25 08:00:56 crc kubenswrapper[4749]: I0225 08:00:56.322628 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:00:56 crc kubenswrapper[4749]: E0225 08:00:56.323761 4749 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.157558 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29533441-4v5xd"] Feb 25 08:01:00 crc kubenswrapper[4749]: E0225 08:01:00.158838 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" containerName="oc" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.158859 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" containerName="oc" Feb 25 08:01:00 crc kubenswrapper[4749]: E0225 08:01:00.158910 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231ef4ae-295d-4fb2-a2dc-f77d87e098f3" containerName="collect-profiles" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.158921 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="231ef4ae-295d-4fb2-a2dc-f77d87e098f3" containerName="collect-profiles" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.159203 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" containerName="oc" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.159241 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="231ef4ae-295d-4fb2-a2dc-f77d87e098f3" containerName="collect-profiles" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.160448 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.175780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533441-4v5xd"] Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.241233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.241314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbp4t\" (UniqueName: \"kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.241432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.241558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.343317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.343830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbp4t\" (UniqueName: \"kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.344030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.344295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.350533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.352480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.353678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.366901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbp4t\" (UniqueName: \"kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t\") pod \"keystone-cron-29533441-4v5xd\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.486094 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:00 crc kubenswrapper[4749]: I0225 08:01:00.967364 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29533441-4v5xd"] Feb 25 08:01:01 crc kubenswrapper[4749]: I0225 08:01:01.801815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533441-4v5xd" event={"ID":"4b21ba16-5e25-41fd-afbb-82072bfdc006","Type":"ContainerStarted","Data":"3aa1b93647733e749aedae798bb4eba2a03db803fba907ea500583cd81c0d39a"} Feb 25 08:01:01 crc kubenswrapper[4749]: I0225 08:01:01.802244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533441-4v5xd" event={"ID":"4b21ba16-5e25-41fd-afbb-82072bfdc006","Type":"ContainerStarted","Data":"0ed21160661262e01af52a090f858ebd41622d7777e3f16ff33dcfa70d3778b0"} Feb 25 08:01:01 crc kubenswrapper[4749]: I0225 08:01:01.826830 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29533441-4v5xd" podStartSLOduration=1.8268071049999999 podStartE2EDuration="1.826807105s" podCreationTimestamp="2026-02-25 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 08:01:01.826180421 +0000 UTC m=+2615.188006451" watchObservedRunningTime="2026-02-25 08:01:01.826807105 +0000 UTC m=+2615.188633145" Feb 25 08:01:03 crc kubenswrapper[4749]: I0225 08:01:03.825792 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b21ba16-5e25-41fd-afbb-82072bfdc006" containerID="3aa1b93647733e749aedae798bb4eba2a03db803fba907ea500583cd81c0d39a" exitCode=0 Feb 25 08:01:03 crc kubenswrapper[4749]: I0225 08:01:03.825913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533441-4v5xd" 
event={"ID":"4b21ba16-5e25-41fd-afbb-82072bfdc006","Type":"ContainerDied","Data":"3aa1b93647733e749aedae798bb4eba2a03db803fba907ea500583cd81c0d39a"} Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.172613 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.269632 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle\") pod \"4b21ba16-5e25-41fd-afbb-82072bfdc006\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.269671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data\") pod \"4b21ba16-5e25-41fd-afbb-82072bfdc006\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.269715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbp4t\" (UniqueName: \"kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t\") pod \"4b21ba16-5e25-41fd-afbb-82072bfdc006\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.269775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys\") pod \"4b21ba16-5e25-41fd-afbb-82072bfdc006\" (UID: \"4b21ba16-5e25-41fd-afbb-82072bfdc006\") " Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.280027 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "4b21ba16-5e25-41fd-afbb-82072bfdc006" (UID: "4b21ba16-5e25-41fd-afbb-82072bfdc006"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.281046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t" (OuterVolumeSpecName: "kube-api-access-nbp4t") pod "4b21ba16-5e25-41fd-afbb-82072bfdc006" (UID: "4b21ba16-5e25-41fd-afbb-82072bfdc006"). InnerVolumeSpecName "kube-api-access-nbp4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.318891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b21ba16-5e25-41fd-afbb-82072bfdc006" (UID: "4b21ba16-5e25-41fd-afbb-82072bfdc006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.344225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data" (OuterVolumeSpecName: "config-data") pod "4b21ba16-5e25-41fd-afbb-82072bfdc006" (UID: "4b21ba16-5e25-41fd-afbb-82072bfdc006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.371765 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.371810 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.371823 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbp4t\" (UniqueName: \"kubernetes.io/projected/4b21ba16-5e25-41fd-afbb-82072bfdc006-kube-api-access-nbp4t\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.371836 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4b21ba16-5e25-41fd-afbb-82072bfdc006-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.849638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29533441-4v5xd" event={"ID":"4b21ba16-5e25-41fd-afbb-82072bfdc006","Type":"ContainerDied","Data":"0ed21160661262e01af52a090f858ebd41622d7777e3f16ff33dcfa70d3778b0"} Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.849691 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed21160661262e01af52a090f858ebd41622d7777e3f16ff33dcfa70d3778b0" Feb 25 08:01:05 crc kubenswrapper[4749]: I0225 08:01:05.849718 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29533441-4v5xd" Feb 25 08:01:10 crc kubenswrapper[4749]: I0225 08:01:10.322981 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:01:10 crc kubenswrapper[4749]: E0225 08:01:10.323997 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:01:21 crc kubenswrapper[4749]: I0225 08:01:21.323652 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:01:21 crc kubenswrapper[4749]: E0225 08:01:21.324674 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:01:22 crc kubenswrapper[4749]: I0225 08:01:22.034290 4749 generic.go:334] "Generic (PLEG): container finished" podID="18d17a7c-1c9d-47bc-818d-c2f567dfe075" containerID="3527de6d4f863b0fa9040bd40f1d4f334df397b30f338fd70dea28c025fe5e95" exitCode=0 Feb 25 08:01:22 crc kubenswrapper[4749]: I0225 08:01:22.034487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" 
event={"ID":"18d17a7c-1c9d-47bc-818d-c2f567dfe075","Type":"ContainerDied","Data":"3527de6d4f863b0fa9040bd40f1d4f334df397b30f338fd70dea28c025fe5e95"} Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.565182 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.677520 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.677915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.678004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.678072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pvq9\" (UniqueName: \"kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.678145 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.678228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.678352 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory\") pod \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\" (UID: \"18d17a7c-1c9d-47bc-818d-c2f567dfe075\") " Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.683994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.684561 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9" (OuterVolumeSpecName: "kube-api-access-2pvq9") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "kube-api-access-2pvq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.709882 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.716894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.717477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.721504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory" (OuterVolumeSpecName: "inventory") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.733845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "18d17a7c-1c9d-47bc-818d-c2f567dfe075" (UID: "18d17a7c-1c9d-47bc-818d-c2f567dfe075"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780156 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780191 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pvq9\" (UniqueName: \"kubernetes.io/projected/18d17a7c-1c9d-47bc-818d-c2f567dfe075-kube-api-access-2pvq9\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780207 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780218 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780230 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-inventory\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc 
kubenswrapper[4749]: I0225 08:01:23.780244 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:23 crc kubenswrapper[4749]: I0225 08:01:23.780260 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/18d17a7c-1c9d-47bc-818d-c2f567dfe075-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 25 08:01:24 crc kubenswrapper[4749]: I0225 08:01:24.064273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" event={"ID":"18d17a7c-1c9d-47bc-818d-c2f567dfe075","Type":"ContainerDied","Data":"63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb"} Feb 25 08:01:24 crc kubenswrapper[4749]: I0225 08:01:24.064342 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63139002daeca7b63bda30aa3d4f047eaa267784ad3e4a8a519279f25a37f9eb" Feb 25 08:01:24 crc kubenswrapper[4749]: I0225 08:01:24.064421 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq" Feb 25 08:01:35 crc kubenswrapper[4749]: I0225 08:01:35.323050 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:01:36 crc kubenswrapper[4749]: I0225 08:01:36.202130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7"} Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.153399 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533442-8qwtn"] Feb 25 08:02:00 crc kubenswrapper[4749]: E0225 08:02:00.154581 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b21ba16-5e25-41fd-afbb-82072bfdc006" containerName="keystone-cron" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.154626 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b21ba16-5e25-41fd-afbb-82072bfdc006" containerName="keystone-cron" Feb 25 08:02:00 crc kubenswrapper[4749]: E0225 08:02:00.154679 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d17a7c-1c9d-47bc-818d-c2f567dfe075" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.154694 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d17a7c-1c9d-47bc-818d-c2f567dfe075" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.155011 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d17a7c-1c9d-47bc-818d-c2f567dfe075" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.155031 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b21ba16-5e25-41fd-afbb-82072bfdc006" containerName="keystone-cron" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.155947 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.159111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.159874 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.161219 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.167826 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533442-8qwtn"] Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.217742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6qj\" (UniqueName: \"kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj\") pod \"auto-csr-approver-29533442-8qwtn\" (UID: \"b11487c4-c25a-48ba-a642-03d6d05630fc\") " pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.319790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6qj\" (UniqueName: \"kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj\") pod \"auto-csr-approver-29533442-8qwtn\" (UID: \"b11487c4-c25a-48ba-a642-03d6d05630fc\") " pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.346428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6qj\" 
(UniqueName: \"kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj\") pod \"auto-csr-approver-29533442-8qwtn\" (UID: \"b11487c4-c25a-48ba-a642-03d6d05630fc\") " pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.489906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:00 crc kubenswrapper[4749]: I0225 08:02:00.950808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533442-8qwtn"] Feb 25 08:02:00 crc kubenswrapper[4749]: W0225 08:02:00.954697 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb11487c4_c25a_48ba_a642_03d6d05630fc.slice/crio-ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc WatchSource:0}: Error finding container ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc: Status 404 returned error can't find the container with id ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc Feb 25 08:02:01 crc kubenswrapper[4749]: I0225 08:02:01.472288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" event={"ID":"b11487c4-c25a-48ba-a642-03d6d05630fc","Type":"ContainerStarted","Data":"ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc"} Feb 25 08:02:02 crc kubenswrapper[4749]: I0225 08:02:02.487400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" event={"ID":"b11487c4-c25a-48ba-a642-03d6d05630fc","Type":"ContainerStarted","Data":"a6de9b850659488c84ca92c7c09ea9050b7377ebf7f52069ed917d0d3d3885f8"} Feb 25 08:02:02 crc kubenswrapper[4749]: I0225 08:02:02.512318 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" 
podStartSLOduration=1.436459223 podStartE2EDuration="2.512297193s" podCreationTimestamp="2026-02-25 08:02:00 +0000 UTC" firstStartedPulling="2026-02-25 08:02:00.958196326 +0000 UTC m=+2674.320022346" lastFinishedPulling="2026-02-25 08:02:02.034034286 +0000 UTC m=+2675.395860316" observedRunningTime="2026-02-25 08:02:02.500388334 +0000 UTC m=+2675.862214354" watchObservedRunningTime="2026-02-25 08:02:02.512297193 +0000 UTC m=+2675.874123223" Feb 25 08:02:03 crc kubenswrapper[4749]: I0225 08:02:03.502324 4749 generic.go:334] "Generic (PLEG): container finished" podID="b11487c4-c25a-48ba-a642-03d6d05630fc" containerID="a6de9b850659488c84ca92c7c09ea9050b7377ebf7f52069ed917d0d3d3885f8" exitCode=0 Feb 25 08:02:03 crc kubenswrapper[4749]: I0225 08:02:03.502382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" event={"ID":"b11487c4-c25a-48ba-a642-03d6d05630fc","Type":"ContainerDied","Data":"a6de9b850659488c84ca92c7c09ea9050b7377ebf7f52069ed917d0d3d3885f8"} Feb 25 08:02:04 crc kubenswrapper[4749]: I0225 08:02:04.962927 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.013839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6qj\" (UniqueName: \"kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj\") pod \"b11487c4-c25a-48ba-a642-03d6d05630fc\" (UID: \"b11487c4-c25a-48ba-a642-03d6d05630fc\") " Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.019956 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj" (OuterVolumeSpecName: "kube-api-access-xm6qj") pod "b11487c4-c25a-48ba-a642-03d6d05630fc" (UID: "b11487c4-c25a-48ba-a642-03d6d05630fc"). InnerVolumeSpecName "kube-api-access-xm6qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.115992 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6qj\" (UniqueName: \"kubernetes.io/projected/b11487c4-c25a-48ba-a642-03d6d05630fc-kube-api-access-xm6qj\") on node \"crc\" DevicePath \"\"" Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.537751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" event={"ID":"b11487c4-c25a-48ba-a642-03d6d05630fc","Type":"ContainerDied","Data":"ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc"} Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.537813 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac31d8c71c81de0fea68b85c546f5be12c97701e0806cea2e1d5a6177ab05cbc" Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.537826 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533442-8qwtn" Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.610040 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533436-gdfsc"] Feb 25 08:02:05 crc kubenswrapper[4749]: I0225 08:02:05.624304 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533436-gdfsc"] Feb 25 08:02:07 crc kubenswrapper[4749]: I0225 08:02:07.340391 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="078b859c-384b-4651-857d-f6337e808e8c" path="/var/lib/kubelet/pods/078b859c-384b-4651-857d-f6337e808e8c/volumes" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.519822 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 08:02:16 crc kubenswrapper[4749]: E0225 08:02:16.520865 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b11487c4-c25a-48ba-a642-03d6d05630fc" containerName="oc" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.520887 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11487c4-c25a-48ba-a642-03d6d05630fc" containerName="oc" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.521179 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11487c4-c25a-48ba-a642-03d6d05630fc" containerName="oc" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.521950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.524017 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.524349 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sdgfx" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.528221 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.528248 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.532454 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.661322 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.661853 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.661884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.661985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.662022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.662520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.662782 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42l8h\" (UniqueName: \"kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.662985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.663255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764760 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764847 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.764958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.765000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42l8h\" (UniqueName: \"kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.765026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.765070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.765563 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.765885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.766120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.766182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir\") 
pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.767289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.771038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.771369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.779651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.789454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42l8h\" (UniqueName: \"kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 
08:02:16.803214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " pod="openstack/tempest-tests-tempest" Feb 25 08:02:16 crc kubenswrapper[4749]: I0225 08:02:16.862256 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 08:02:17 crc kubenswrapper[4749]: I0225 08:02:17.367734 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 25 08:02:17 crc kubenswrapper[4749]: I0225 08:02:17.686249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ffbba01-9e0c-4754-a378-68eaba4c858e","Type":"ContainerStarted","Data":"c5fbae8c632b91847dffe386009e2424fa51cdd582e6a9d232a64089f59a43e7"} Feb 25 08:02:45 crc kubenswrapper[4749]: I0225 08:02:45.728979 4749 scope.go:117] "RemoveContainer" containerID="a2fe4adc72ac95efe71e1f00fd9518b18143acf1d1828ef38f9155cf17242757" Feb 25 08:02:53 crc kubenswrapper[4749]: E0225 08:02:53.068440 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 25 08:02:53 crc kubenswrapper[4749]: E0225 08:02:53.068911 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42l8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(6ffbba01-9e0c-4754-a378-68eaba4c858e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 25 08:02:53 crc kubenswrapper[4749]: E0225 08:02:53.070136 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="6ffbba01-9e0c-4754-a378-68eaba4c858e" Feb 25 08:02:53 crc kubenswrapper[4749]: E0225 08:02:53.087834 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="6ffbba01-9e0c-4754-a378-68eaba4c858e" Feb 25 08:02:58 crc 
kubenswrapper[4749]: I0225 08:02:58.743875 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.746125 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.772714 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.868878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzgw\" (UniqueName: \"kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.871894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.872080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.973793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.973907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzgw\" (UniqueName: \"kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.973996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.974525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.974516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:58 crc kubenswrapper[4749]: I0225 08:02:58.998020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzgw\" (UniqueName: 
\"kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw\") pod \"certified-operators-4cn4g\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:59 crc kubenswrapper[4749]: I0225 08:02:59.067556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:02:59 crc kubenswrapper[4749]: I0225 08:02:59.562120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:03:00 crc kubenswrapper[4749]: I0225 08:03:00.159437 4749 generic.go:334] "Generic (PLEG): container finished" podID="07d80094-5bee-4946-97c0-b89d44c9206f" containerID="6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4" exitCode=0 Feb 25 08:03:00 crc kubenswrapper[4749]: I0225 08:03:00.159904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerDied","Data":"6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4"} Feb 25 08:03:00 crc kubenswrapper[4749]: I0225 08:03:00.159949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerStarted","Data":"bc4fd59f6d15511cd34bcb5991a20ccd9962a7d10faf45d6df487eb55463bbc7"} Feb 25 08:03:01 crc kubenswrapper[4749]: I0225 08:03:01.171789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerStarted","Data":"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9"} Feb 25 08:03:02 crc kubenswrapper[4749]: I0225 08:03:02.184316 4749 generic.go:334] "Generic (PLEG): container finished" podID="07d80094-5bee-4946-97c0-b89d44c9206f" 
containerID="d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9" exitCode=0 Feb 25 08:03:02 crc kubenswrapper[4749]: I0225 08:03:02.184441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerDied","Data":"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9"} Feb 25 08:03:02 crc kubenswrapper[4749]: I0225 08:03:02.187989 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 08:03:03 crc kubenswrapper[4749]: I0225 08:03:03.199511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerStarted","Data":"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472"} Feb 25 08:03:03 crc kubenswrapper[4749]: I0225 08:03:03.226402 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4cn4g" podStartSLOduration=2.7519297480000002 podStartE2EDuration="5.226379562s" podCreationTimestamp="2026-02-25 08:02:58 +0000 UTC" firstStartedPulling="2026-02-25 08:03:00.162562233 +0000 UTC m=+2733.524388293" lastFinishedPulling="2026-02-25 08:03:02.637012047 +0000 UTC m=+2735.998838107" observedRunningTime="2026-02-25 08:03:03.218145243 +0000 UTC m=+2736.579971283" watchObservedRunningTime="2026-02-25 08:03:03.226379562 +0000 UTC m=+2736.588205592" Feb 25 08:03:07 crc kubenswrapper[4749]: I0225 08:03:07.809168 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.068000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.068772 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.147692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.272691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ffbba01-9e0c-4754-a378-68eaba4c858e","Type":"ContainerStarted","Data":"3c695c1d346e1e87ef6ef8437acb69045ea5e0aa45604ee313f38a60d669b127"} Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.309100 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.865535966 podStartE2EDuration="54.309070791s" podCreationTimestamp="2026-02-25 08:02:15 +0000 UTC" firstStartedPulling="2026-02-25 08:02:17.359718619 +0000 UTC m=+2690.721544679" lastFinishedPulling="2026-02-25 08:03:07.803253444 +0000 UTC m=+2741.165079504" observedRunningTime="2026-02-25 08:03:09.294816136 +0000 UTC m=+2742.656642166" watchObservedRunningTime="2026-02-25 08:03:09.309070791 +0000 UTC m=+2742.670896851" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.355023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:09 crc kubenswrapper[4749]: I0225 08:03:09.417770 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.295688 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4cn4g" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="registry-server" containerID="cri-o://a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472" gracePeriod=2 Feb 25 08:03:11 crc 
kubenswrapper[4749]: I0225 08:03:11.815633 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.918230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content\") pod \"07d80094-5bee-4946-97c0-b89d44c9206f\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.918712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities\") pod \"07d80094-5bee-4946-97c0-b89d44c9206f\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.918763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nzgw\" (UniqueName: \"kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw\") pod \"07d80094-5bee-4946-97c0-b89d44c9206f\" (UID: \"07d80094-5bee-4946-97c0-b89d44c9206f\") " Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.920161 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities" (OuterVolumeSpecName: "utilities") pod "07d80094-5bee-4946-97c0-b89d44c9206f" (UID: "07d80094-5bee-4946-97c0-b89d44c9206f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.928422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw" (OuterVolumeSpecName: "kube-api-access-6nzgw") pod "07d80094-5bee-4946-97c0-b89d44c9206f" (UID: "07d80094-5bee-4946-97c0-b89d44c9206f"). InnerVolumeSpecName "kube-api-access-6nzgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:03:11 crc kubenswrapper[4749]: I0225 08:03:11.980201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07d80094-5bee-4946-97c0-b89d44c9206f" (UID: "07d80094-5bee-4946-97c0-b89d44c9206f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.022739 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.022765 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nzgw\" (UniqueName: \"kubernetes.io/projected/07d80094-5bee-4946-97c0-b89d44c9206f-kube-api-access-6nzgw\") on node \"crc\" DevicePath \"\"" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.022777 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d80094-5bee-4946-97c0-b89d44c9206f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.308525 4749 generic.go:334] "Generic (PLEG): container finished" podID="07d80094-5bee-4946-97c0-b89d44c9206f" 
containerID="a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472" exitCode=0 Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.308565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerDied","Data":"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472"} Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.308622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4cn4g" event={"ID":"07d80094-5bee-4946-97c0-b89d44c9206f","Type":"ContainerDied","Data":"bc4fd59f6d15511cd34bcb5991a20ccd9962a7d10faf45d6df487eb55463bbc7"} Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.308639 4749 scope.go:117] "RemoveContainer" containerID="a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.308645 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4cn4g" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.348627 4749 scope.go:117] "RemoveContainer" containerID="d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.353233 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.366027 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4cn4g"] Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.374788 4749 scope.go:117] "RemoveContainer" containerID="6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.414330 4749 scope.go:117] "RemoveContainer" containerID="a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472" Feb 25 08:03:12 crc kubenswrapper[4749]: E0225 08:03:12.415160 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472\": container with ID starting with a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472 not found: ID does not exist" containerID="a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.415212 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472"} err="failed to get container status \"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472\": rpc error: code = NotFound desc = could not find container \"a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472\": container with ID starting with a9dfbdac0aaa0c898c86ab04911007fdf42ff2affba5c7fee26c41cf74378472 not 
found: ID does not exist" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.415246 4749 scope.go:117] "RemoveContainer" containerID="d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9" Feb 25 08:03:12 crc kubenswrapper[4749]: E0225 08:03:12.415990 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9\": container with ID starting with d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9 not found: ID does not exist" containerID="d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.416194 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9"} err="failed to get container status \"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9\": rpc error: code = NotFound desc = could not find container \"d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9\": container with ID starting with d819e3001fef83ce52c08d00259694fd5c6201eb26901c8fd3b6e122dcc9a6c9 not found: ID does not exist" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.416365 4749 scope.go:117] "RemoveContainer" containerID="6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4" Feb 25 08:03:12 crc kubenswrapper[4749]: E0225 08:03:12.416936 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4\": container with ID starting with 6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4 not found: ID does not exist" containerID="6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4" Feb 25 08:03:12 crc kubenswrapper[4749]: I0225 08:03:12.416985 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4"} err="failed to get container status \"6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4\": rpc error: code = NotFound desc = could not find container \"6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4\": container with ID starting with 6181b7ba029d9659a44ea940b38d9e90ab944e222e5c3a4415cde826041fdbd4 not found: ID does not exist" Feb 25 08:03:13 crc kubenswrapper[4749]: I0225 08:03:13.334260 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" path="/var/lib/kubelet/pods/07d80094-5bee-4946-97c0-b89d44c9206f/volumes" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.599229 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:03:47 crc kubenswrapper[4749]: E0225 08:03:47.600207 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="extract-content" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.600224 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="extract-content" Feb 25 08:03:47 crc kubenswrapper[4749]: E0225 08:03:47.600242 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="registry-server" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.600250 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="registry-server" Feb 25 08:03:47 crc kubenswrapper[4749]: E0225 08:03:47.600269 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="extract-utilities" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 
08:03:47.600282 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="extract-utilities" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.600504 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d80094-5bee-4946-97c0-b89d44c9206f" containerName="registry-server" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.601789 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.649540 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.670570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.670655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrzj\" (UniqueName: \"kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.670804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 
08:03:47.772181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.772229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrzj\" (UniqueName: \"kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.772331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.772855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.772855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.794216 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hbrzj\" (UniqueName: \"kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj\") pod \"redhat-operators-nh72n\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:47 crc kubenswrapper[4749]: I0225 08:03:47.951720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:48 crc kubenswrapper[4749]: I0225 08:03:48.458731 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:03:48 crc kubenswrapper[4749]: I0225 08:03:48.717822 4749 generic.go:334] "Generic (PLEG): container finished" podID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerID="b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1" exitCode=0 Feb 25 08:03:48 crc kubenswrapper[4749]: I0225 08:03:48.717892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerDied","Data":"b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1"} Feb 25 08:03:48 crc kubenswrapper[4749]: I0225 08:03:48.717942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerStarted","Data":"786d95f6202e5dff3e7bf6075d693ebe7c7fc1b7909c2b00473b19315c3a314b"} Feb 25 08:03:49 crc kubenswrapper[4749]: I0225 08:03:49.731675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerStarted","Data":"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee"} Feb 25 08:03:51 crc kubenswrapper[4749]: I0225 08:03:51.671514 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:03:51 crc kubenswrapper[4749]: I0225 08:03:51.672048 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:03:52 crc kubenswrapper[4749]: I0225 08:03:52.766135 4749 generic.go:334] "Generic (PLEG): container finished" podID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerID="2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee" exitCode=0 Feb 25 08:03:52 crc kubenswrapper[4749]: I0225 08:03:52.766360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerDied","Data":"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee"} Feb 25 08:03:53 crc kubenswrapper[4749]: I0225 08:03:53.779048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerStarted","Data":"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc"} Feb 25 08:03:53 crc kubenswrapper[4749]: I0225 08:03:53.797283 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nh72n" podStartSLOduration=2.306379768 podStartE2EDuration="6.797266873s" podCreationTimestamp="2026-02-25 08:03:47 +0000 UTC" firstStartedPulling="2026-02-25 08:03:48.721817287 +0000 UTC m=+2782.083643297" lastFinishedPulling="2026-02-25 08:03:53.212704342 +0000 UTC m=+2786.574530402" 
observedRunningTime="2026-02-25 08:03:53.792324584 +0000 UTC m=+2787.154150604" watchObservedRunningTime="2026-02-25 08:03:53.797266873 +0000 UTC m=+2787.159092903" Feb 25 08:03:57 crc kubenswrapper[4749]: I0225 08:03:57.952150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:57 crc kubenswrapper[4749]: I0225 08:03:57.952974 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:03:59 crc kubenswrapper[4749]: I0225 08:03:59.021410 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nh72n" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="registry-server" probeResult="failure" output=< Feb 25 08:03:59 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 08:03:59 crc kubenswrapper[4749]: > Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.148198 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533444-ptnz2"] Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.151193 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.155515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.156990 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.157054 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.160371 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533444-ptnz2"] Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.339567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshpq\" (UniqueName: \"kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq\") pod \"auto-csr-approver-29533444-ptnz2\" (UID: \"81152f8c-3d00-497e-a709-71740dbc092c\") " pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.441762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bshpq\" (UniqueName: \"kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq\") pod \"auto-csr-approver-29533444-ptnz2\" (UID: \"81152f8c-3d00-497e-a709-71740dbc092c\") " pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.462941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshpq\" (UniqueName: \"kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq\") pod \"auto-csr-approver-29533444-ptnz2\" (UID: \"81152f8c-3d00-497e-a709-71740dbc092c\") " 
pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.480808 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:00 crc kubenswrapper[4749]: I0225 08:04:00.953033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533444-ptnz2"] Feb 25 08:04:01 crc kubenswrapper[4749]: I0225 08:04:01.848687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" event={"ID":"81152f8c-3d00-497e-a709-71740dbc092c","Type":"ContainerStarted","Data":"233301617a76b4362e472f510380ed86ba0f6c1fed48f558e758d3dee6fa1628"} Feb 25 08:04:02 crc kubenswrapper[4749]: I0225 08:04:02.863682 4749 generic.go:334] "Generic (PLEG): container finished" podID="81152f8c-3d00-497e-a709-71740dbc092c" containerID="4692b55371dbc5eb55e665e454bafe96f70100db05720deeeafc7fe996f603a3" exitCode=0 Feb 25 08:04:02 crc kubenswrapper[4749]: I0225 08:04:02.863786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" event={"ID":"81152f8c-3d00-497e-a709-71740dbc092c","Type":"ContainerDied","Data":"4692b55371dbc5eb55e665e454bafe96f70100db05720deeeafc7fe996f603a3"} Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.255135 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.422090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bshpq\" (UniqueName: \"kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq\") pod \"81152f8c-3d00-497e-a709-71740dbc092c\" (UID: \"81152f8c-3d00-497e-a709-71740dbc092c\") " Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.428517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq" (OuterVolumeSpecName: "kube-api-access-bshpq") pod "81152f8c-3d00-497e-a709-71740dbc092c" (UID: "81152f8c-3d00-497e-a709-71740dbc092c"). InnerVolumeSpecName "kube-api-access-bshpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.525013 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bshpq\" (UniqueName: \"kubernetes.io/projected/81152f8c-3d00-497e-a709-71740dbc092c-kube-api-access-bshpq\") on node \"crc\" DevicePath \"\"" Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.881539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" event={"ID":"81152f8c-3d00-497e-a709-71740dbc092c","Type":"ContainerDied","Data":"233301617a76b4362e472f510380ed86ba0f6c1fed48f558e758d3dee6fa1628"} Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.881893 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="233301617a76b4362e472f510380ed86ba0f6c1fed48f558e758d3dee6fa1628" Feb 25 08:04:04 crc kubenswrapper[4749]: I0225 08:04:04.881612 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533444-ptnz2" Feb 25 08:04:05 crc kubenswrapper[4749]: I0225 08:04:05.332872 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533438-d6qkv"] Feb 25 08:04:05 crc kubenswrapper[4749]: I0225 08:04:05.339783 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533438-d6qkv"] Feb 25 08:04:07 crc kubenswrapper[4749]: I0225 08:04:07.333921 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181c0d3e-7868-480b-b159-f0315cb19dd4" path="/var/lib/kubelet/pods/181c0d3e-7868-480b-b159-f0315cb19dd4/volumes" Feb 25 08:04:08 crc kubenswrapper[4749]: I0225 08:04:08.006680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:04:08 crc kubenswrapper[4749]: I0225 08:04:08.060951 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:04:08 crc kubenswrapper[4749]: I0225 08:04:08.246902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:04:09 crc kubenswrapper[4749]: I0225 08:04:09.929429 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nh72n" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="registry-server" containerID="cri-o://d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc" gracePeriod=2 Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.533734 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.643611 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbrzj\" (UniqueName: \"kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj\") pod \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.643806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content\") pod \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.644497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities\") pod \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\" (UID: \"cec4f3d6-d73a-4f74-acd4-f56658defc4a\") " Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.645116 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities" (OuterVolumeSpecName: "utilities") pod "cec4f3d6-d73a-4f74-acd4-f56658defc4a" (UID: "cec4f3d6-d73a-4f74-acd4-f56658defc4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.651124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj" (OuterVolumeSpecName: "kube-api-access-hbrzj") pod "cec4f3d6-d73a-4f74-acd4-f56658defc4a" (UID: "cec4f3d6-d73a-4f74-acd4-f56658defc4a"). InnerVolumeSpecName "kube-api-access-hbrzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.746299 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.746334 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbrzj\" (UniqueName: \"kubernetes.io/projected/cec4f3d6-d73a-4f74-acd4-f56658defc4a-kube-api-access-hbrzj\") on node \"crc\" DevicePath \"\"" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.814154 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cec4f3d6-d73a-4f74-acd4-f56658defc4a" (UID: "cec4f3d6-d73a-4f74-acd4-f56658defc4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.847905 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cec4f3d6-d73a-4f74-acd4-f56658defc4a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.937609 4749 generic.go:334] "Generic (PLEG): container finished" podID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerID="d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc" exitCode=0 Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.937652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerDied","Data":"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc"} Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.937682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nh72n" event={"ID":"cec4f3d6-d73a-4f74-acd4-f56658defc4a","Type":"ContainerDied","Data":"786d95f6202e5dff3e7bf6075d693ebe7c7fc1b7909c2b00473b19315c3a314b"} Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.937698 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nh72n" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.937705 4749 scope.go:117] "RemoveContainer" containerID="d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.956893 4749 scope.go:117] "RemoveContainer" containerID="2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.980749 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.989670 4749 scope.go:117] "RemoveContainer" containerID="b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1" Feb 25 08:04:10 crc kubenswrapper[4749]: I0225 08:04:10.990472 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nh72n"] Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.027265 4749 scope.go:117] "RemoveContainer" containerID="d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc" Feb 25 08:04:11 crc kubenswrapper[4749]: E0225 08:04:11.027729 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc\": container with ID starting with d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc not found: ID does not exist" containerID="d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.027764 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc"} err="failed to get container status \"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc\": rpc error: code = NotFound desc = could not find container \"d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc\": container with ID starting with d238cc13a97b9abf9d21a276ad8fccf3d63116054d18b12f129b89ffcce115fc not found: ID does not exist" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.027790 4749 scope.go:117] "RemoveContainer" containerID="2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee" Feb 25 08:04:11 crc kubenswrapper[4749]: E0225 08:04:11.028124 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee\": container with ID starting with 2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee not found: ID does not exist" containerID="2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.028146 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee"} err="failed to get container status \"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee\": rpc error: code = NotFound desc = could not find container \"2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee\": container with ID starting with 2886b8d29b0d6accf11db8a0fc0968fae640100afe089e8147424bdbd97920ee not found: ID does not exist" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.028162 4749 scope.go:117] "RemoveContainer" containerID="b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1" Feb 25 08:04:11 crc kubenswrapper[4749]: E0225 
08:04:11.028374 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1\": container with ID starting with b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1 not found: ID does not exist" containerID="b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.028416 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1"} err="failed to get container status \"b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1\": rpc error: code = NotFound desc = could not find container \"b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1\": container with ID starting with b7e2f0bd9fefe5c2b4df6e7c0c51164bcf943fb75fb3335d35c8d9ad395dcbc1 not found: ID does not exist" Feb 25 08:04:11 crc kubenswrapper[4749]: I0225 08:04:11.348178 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" path="/var/lib/kubelet/pods/cec4f3d6-d73a-4f74-acd4-f56658defc4a/volumes" Feb 25 08:04:21 crc kubenswrapper[4749]: I0225 08:04:21.671438 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:04:21 crc kubenswrapper[4749]: I0225 08:04:21.672239 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 25 08:04:51 crc kubenswrapper[4749]: I0225 08:04:51.672194 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:04:51 crc kubenswrapper[4749]: I0225 08:04:51.672974 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:04:51 crc kubenswrapper[4749]: I0225 08:04:51.673055 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 08:04:51 crc kubenswrapper[4749]: I0225 08:04:51.674384 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 08:04:51 crc kubenswrapper[4749]: I0225 08:04:51.674519 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7" gracePeriod=600 Feb 25 08:04:52 crc kubenswrapper[4749]: I0225 08:04:52.477342 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" 
containerID="bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7" exitCode=0 Feb 25 08:04:52 crc kubenswrapper[4749]: I0225 08:04:52.477446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7"} Feb 25 08:04:52 crc kubenswrapper[4749]: I0225 08:04:52.478148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"} Feb 25 08:04:52 crc kubenswrapper[4749]: I0225 08:04:52.478191 4749 scope.go:117] "RemoveContainer" containerID="fdda4961bc0709da05a789a5097522a4550f46933aa53abfd6e8990c2a458a74" Feb 25 08:04:53 crc kubenswrapper[4749]: I0225 08:04:53.124892 4749 scope.go:117] "RemoveContainer" containerID="0d8c07bba520694800d7dac8c043f1e44e5e7eb54dba35dd6ce7e49463724141" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.937008 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:04:59 crc kubenswrapper[4749]: E0225 08:04:59.938175 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="extract-utilities" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938199 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="extract-utilities" Feb 25 08:04:59 crc kubenswrapper[4749]: E0225 08:04:59.938231 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="registry-server" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938243 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="registry-server" Feb 25 08:04:59 crc kubenswrapper[4749]: E0225 08:04:59.938284 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="extract-content" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938297 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="extract-content" Feb 25 08:04:59 crc kubenswrapper[4749]: E0225 08:04:59.938327 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81152f8c-3d00-497e-a709-71740dbc092c" containerName="oc" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938340 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81152f8c-3d00-497e-a709-71740dbc092c" containerName="oc" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938739 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cec4f3d6-d73a-4f74-acd4-f56658defc4a" containerName="registry-server" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.938780 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81152f8c-3d00-497e-a709-71740dbc092c" containerName="oc" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.941102 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:04:59 crc kubenswrapper[4749]: I0225 08:04:59.948113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.031973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.032069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.032128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5nd\" (UniqueName: \"kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.133631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.133841 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.133867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5nd\" (UniqueName: \"kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.134134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.134209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.152514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5nd\" (UniqueName: \"kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd\") pod \"redhat-marketplace-kb2pk\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.296875 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:00 crc kubenswrapper[4749]: I0225 08:05:00.768114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:05:01 crc kubenswrapper[4749]: I0225 08:05:01.590812 4749 generic.go:334] "Generic (PLEG): container finished" podID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerID="02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa" exitCode=0 Feb 25 08:05:01 crc kubenswrapper[4749]: I0225 08:05:01.590892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerDied","Data":"02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa"} Feb 25 08:05:01 crc kubenswrapper[4749]: I0225 08:05:01.591380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerStarted","Data":"1fba1bc24e852661529e7e040c24d89ce428497c2ed33ed0202f500fcfcd0a18"} Feb 25 08:05:03 crc kubenswrapper[4749]: I0225 08:05:03.631529 4749 generic.go:334] "Generic (PLEG): container finished" podID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerID="4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c" exitCode=0 Feb 25 08:05:03 crc kubenswrapper[4749]: I0225 08:05:03.631637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerDied","Data":"4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c"} Feb 25 08:05:04 crc kubenswrapper[4749]: I0225 08:05:04.651476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" 
event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerStarted","Data":"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32"} Feb 25 08:05:04 crc kubenswrapper[4749]: I0225 08:05:04.685330 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kb2pk" podStartSLOduration=3.245793419 podStartE2EDuration="5.685298023s" podCreationTimestamp="2026-02-25 08:04:59 +0000 UTC" firstStartedPulling="2026-02-25 08:05:01.596626016 +0000 UTC m=+2854.958452036" lastFinishedPulling="2026-02-25 08:05:04.03613058 +0000 UTC m=+2857.397956640" observedRunningTime="2026-02-25 08:05:04.671892979 +0000 UTC m=+2858.033719039" watchObservedRunningTime="2026-02-25 08:05:04.685298023 +0000 UTC m=+2858.047124083" Feb 25 08:05:10 crc kubenswrapper[4749]: I0225 08:05:10.297853 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:10 crc kubenswrapper[4749]: I0225 08:05:10.299365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:10 crc kubenswrapper[4749]: I0225 08:05:10.350520 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:10 crc kubenswrapper[4749]: I0225 08:05:10.805706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:10 crc kubenswrapper[4749]: I0225 08:05:10.882110 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:05:12 crc kubenswrapper[4749]: I0225 08:05:12.747229 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kb2pk" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="registry-server" 
containerID="cri-o://93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32" gracePeriod=2 Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.355912 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.422510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities\") pod \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.422586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content\") pod \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.422654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk5nd\" (UniqueName: \"kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd\") pod \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\" (UID: \"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8\") " Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.425804 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities" (OuterVolumeSpecName: "utilities") pod "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" (UID: "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.431990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd" (OuterVolumeSpecName: "kube-api-access-nk5nd") pod "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" (UID: "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8"). InnerVolumeSpecName "kube-api-access-nk5nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.432587 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.432641 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk5nd\" (UniqueName: \"kubernetes.io/projected/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-kube-api-access-nk5nd\") on node \"crc\" DevicePath \"\"" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.605634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" (UID: "527c7ee1-c7cf-471d-87ed-1e4b3565d7f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.636710 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.760094 4749 generic.go:334] "Generic (PLEG): container finished" podID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerID="93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32" exitCode=0 Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.760144 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerDied","Data":"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32"} Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.760190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kb2pk" event={"ID":"527c7ee1-c7cf-471d-87ed-1e4b3565d7f8","Type":"ContainerDied","Data":"1fba1bc24e852661529e7e040c24d89ce428497c2ed33ed0202f500fcfcd0a18"} Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.760215 4749 scope.go:117] "RemoveContainer" containerID="93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.760228 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kb2pk" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.791430 4749 scope.go:117] "RemoveContainer" containerID="4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.809646 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.817837 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kb2pk"] Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.840148 4749 scope.go:117] "RemoveContainer" containerID="02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.883649 4749 scope.go:117] "RemoveContainer" containerID="93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32" Feb 25 08:05:13 crc kubenswrapper[4749]: E0225 08:05:13.884222 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32\": container with ID starting with 93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32 not found: ID does not exist" containerID="93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.884254 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32"} err="failed to get container status \"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32\": rpc error: code = NotFound desc = could not find container \"93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32\": container with ID starting with 93eba0e1da0b81b50fef67bb9f62e17be497cbb29c7826e840e2f992187cbb32 not found: 
ID does not exist" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.884274 4749 scope.go:117] "RemoveContainer" containerID="4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c" Feb 25 08:05:13 crc kubenswrapper[4749]: E0225 08:05:13.885106 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c\": container with ID starting with 4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c not found: ID does not exist" containerID="4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.885128 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c"} err="failed to get container status \"4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c\": rpc error: code = NotFound desc = could not find container \"4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c\": container with ID starting with 4eaae866c0548f4dafc2bb5c851615d97446816f4803c28da1a4b9b43a7d218c not found: ID does not exist" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.885141 4749 scope.go:117] "RemoveContainer" containerID="02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa" Feb 25 08:05:13 crc kubenswrapper[4749]: E0225 08:05:13.885980 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa\": container with ID starting with 02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa not found: ID does not exist" containerID="02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa" Feb 25 08:05:13 crc kubenswrapper[4749]: I0225 08:05:13.886023 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa"} err="failed to get container status \"02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa\": rpc error: code = NotFound desc = could not find container \"02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa\": container with ID starting with 02c492d3163f75f45f9361b09cd5f2623e0061d7f42072de163afc04e723adfa not found: ID does not exist" Feb 25 08:05:15 crc kubenswrapper[4749]: I0225 08:05:15.334063 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" path="/var/lib/kubelet/pods/527c7ee1-c7cf-471d-87ed-1e4b3565d7f8/volumes" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.153315 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533446-zhnq7"] Feb 25 08:06:00 crc kubenswrapper[4749]: E0225 08:06:00.154502 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="extract-content" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.154520 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="extract-content" Feb 25 08:06:00 crc kubenswrapper[4749]: E0225 08:06:00.154535 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="registry-server" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.154543 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="registry-server" Feb 25 08:06:00 crc kubenswrapper[4749]: E0225 08:06:00.154576 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="extract-utilities" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.154585 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="extract-utilities" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.154859 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="527c7ee1-c7cf-471d-87ed-1e4b3565d7f8" containerName="registry-server" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.155674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.158194 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.158469 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.160195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.165149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533446-zhnq7"] Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.257325 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqf6\" (UniqueName: \"kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6\") pod \"auto-csr-approver-29533446-zhnq7\" (UID: \"1325fdca-732c-4117-a33f-4e5780feb63c\") " pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.360123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqf6\" (UniqueName: \"kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6\") pod \"auto-csr-approver-29533446-zhnq7\" (UID: 
\"1325fdca-732c-4117-a33f-4e5780feb63c\") " pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.388429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqf6\" (UniqueName: \"kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6\") pod \"auto-csr-approver-29533446-zhnq7\" (UID: \"1325fdca-732c-4117-a33f-4e5780feb63c\") " pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.473448 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:00 crc kubenswrapper[4749]: I0225 08:06:00.987032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533446-zhnq7"] Feb 25 08:06:01 crc kubenswrapper[4749]: I0225 08:06:01.270850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" event={"ID":"1325fdca-732c-4117-a33f-4e5780feb63c","Type":"ContainerStarted","Data":"d362af0c194ac8d5512a9d3956f4d937c335a3ac08b5fb1063658b3fd88d712e"} Feb 25 08:06:02 crc kubenswrapper[4749]: I0225 08:06:02.281474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" event={"ID":"1325fdca-732c-4117-a33f-4e5780feb63c","Type":"ContainerStarted","Data":"98b76538865db82f56e5d9a4da2f4451c5ef8c4c7a8c5c046672133c4fb8ccdd"} Feb 25 08:06:02 crc kubenswrapper[4749]: I0225 08:06:02.301070 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" podStartSLOduration=1.5313143949999999 podStartE2EDuration="2.301050782s" podCreationTimestamp="2026-02-25 08:06:00 +0000 UTC" firstStartedPulling="2026-02-25 08:06:00.991090835 +0000 UTC m=+2914.352916855" lastFinishedPulling="2026-02-25 08:06:01.760827212 +0000 UTC 
m=+2915.122653242" observedRunningTime="2026-02-25 08:06:02.294396452 +0000 UTC m=+2915.656222472" watchObservedRunningTime="2026-02-25 08:06:02.301050782 +0000 UTC m=+2915.662876802" Feb 25 08:06:03 crc kubenswrapper[4749]: I0225 08:06:03.291886 4749 generic.go:334] "Generic (PLEG): container finished" podID="1325fdca-732c-4117-a33f-4e5780feb63c" containerID="98b76538865db82f56e5d9a4da2f4451c5ef8c4c7a8c5c046672133c4fb8ccdd" exitCode=0 Feb 25 08:06:03 crc kubenswrapper[4749]: I0225 08:06:03.291989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" event={"ID":"1325fdca-732c-4117-a33f-4e5780feb63c","Type":"ContainerDied","Data":"98b76538865db82f56e5d9a4da2f4451c5ef8c4c7a8c5c046672133c4fb8ccdd"} Feb 25 08:06:04 crc kubenswrapper[4749]: I0225 08:06:04.682232 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:04 crc kubenswrapper[4749]: I0225 08:06:04.744012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcqf6\" (UniqueName: \"kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6\") pod \"1325fdca-732c-4117-a33f-4e5780feb63c\" (UID: \"1325fdca-732c-4117-a33f-4e5780feb63c\") " Feb 25 08:06:04 crc kubenswrapper[4749]: I0225 08:06:04.754132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6" (OuterVolumeSpecName: "kube-api-access-rcqf6") pod "1325fdca-732c-4117-a33f-4e5780feb63c" (UID: "1325fdca-732c-4117-a33f-4e5780feb63c"). InnerVolumeSpecName "kube-api-access-rcqf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:06:04 crc kubenswrapper[4749]: I0225 08:06:04.846641 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcqf6\" (UniqueName: \"kubernetes.io/projected/1325fdca-732c-4117-a33f-4e5780feb63c-kube-api-access-rcqf6\") on node \"crc\" DevicePath \"\"" Feb 25 08:06:05 crc kubenswrapper[4749]: I0225 08:06:05.313556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" event={"ID":"1325fdca-732c-4117-a33f-4e5780feb63c","Type":"ContainerDied","Data":"d362af0c194ac8d5512a9d3956f4d937c335a3ac08b5fb1063658b3fd88d712e"} Feb 25 08:06:05 crc kubenswrapper[4749]: I0225 08:06:05.313643 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533446-zhnq7" Feb 25 08:06:05 crc kubenswrapper[4749]: I0225 08:06:05.313660 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d362af0c194ac8d5512a9d3956f4d937c335a3ac08b5fb1063658b3fd88d712e" Feb 25 08:06:05 crc kubenswrapper[4749]: I0225 08:06:05.416941 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533440-gbrqs"] Feb 25 08:06:05 crc kubenswrapper[4749]: I0225 08:06:05.430338 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533440-gbrqs"] Feb 25 08:06:07 crc kubenswrapper[4749]: I0225 08:06:07.339208 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001307fc-d97a-42a9-91b4-1b0b9bb5ba4b" path="/var/lib/kubelet/pods/001307fc-d97a-42a9-91b4-1b0b9bb5ba4b/volumes" Feb 25 08:06:53 crc kubenswrapper[4749]: I0225 08:06:53.326645 4749 scope.go:117] "RemoveContainer" containerID="5c62684f9bd9fa546ef6e84b055dd05993c8816acbcc3720ec57f5fced79fa79" Feb 25 08:07:21 crc kubenswrapper[4749]: I0225 08:07:21.671724 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:07:21 crc kubenswrapper[4749]: I0225 08:07:21.672184 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:07:51 crc kubenswrapper[4749]: I0225 08:07:51.671768 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:07:51 crc kubenswrapper[4749]: I0225 08:07:51.672212 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.162213 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533448-bbw5b"] Feb 25 08:08:00 crc kubenswrapper[4749]: E0225 08:08:00.163108 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1325fdca-732c-4117-a33f-4e5780feb63c" containerName="oc" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.163120 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1325fdca-732c-4117-a33f-4e5780feb63c" containerName="oc" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.163303 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1325fdca-732c-4117-a33f-4e5780feb63c" containerName="oc" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.163970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.167769 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.168331 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.170967 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.171052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2z6\" (UniqueName: \"kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6\") pod \"auto-csr-approver-29533448-bbw5b\" (UID: \"da44cd5a-ceed-4771-a179-73a84ceefd45\") " pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.180730 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533448-bbw5b"] Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.273559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2z6\" (UniqueName: \"kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6\") pod \"auto-csr-approver-29533448-bbw5b\" (UID: \"da44cd5a-ceed-4771-a179-73a84ceefd45\") " pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.292018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dh2z6\" (UniqueName: \"kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6\") pod \"auto-csr-approver-29533448-bbw5b\" (UID: \"da44cd5a-ceed-4771-a179-73a84ceefd45\") " pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.489531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:00 crc kubenswrapper[4749]: I0225 08:08:00.949020 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533448-bbw5b"] Feb 25 08:08:01 crc kubenswrapper[4749]: I0225 08:08:01.672679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" event={"ID":"da44cd5a-ceed-4771-a179-73a84ceefd45","Type":"ContainerStarted","Data":"0ddf8b79feb5ad903eb4440f50d69f067aaad48e664ae8371da15d3bfcb2e57b"} Feb 25 08:08:02 crc kubenswrapper[4749]: I0225 08:08:02.698378 4749 generic.go:334] "Generic (PLEG): container finished" podID="da44cd5a-ceed-4771-a179-73a84ceefd45" containerID="566d2318ac32b3c311bfc20cbee3fa9492b2d2b293150d02e5efe161415b7d7c" exitCode=0 Feb 25 08:08:02 crc kubenswrapper[4749]: I0225 08:08:02.698428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" event={"ID":"da44cd5a-ceed-4771-a179-73a84ceefd45","Type":"ContainerDied","Data":"566d2318ac32b3c311bfc20cbee3fa9492b2d2b293150d02e5efe161415b7d7c"} Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.109423 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.268215 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2z6\" (UniqueName: \"kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6\") pod \"da44cd5a-ceed-4771-a179-73a84ceefd45\" (UID: \"da44cd5a-ceed-4771-a179-73a84ceefd45\") " Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.275772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6" (OuterVolumeSpecName: "kube-api-access-dh2z6") pod "da44cd5a-ceed-4771-a179-73a84ceefd45" (UID: "da44cd5a-ceed-4771-a179-73a84ceefd45"). InnerVolumeSpecName "kube-api-access-dh2z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.370414 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2z6\" (UniqueName: \"kubernetes.io/projected/da44cd5a-ceed-4771-a179-73a84ceefd45-kube-api-access-dh2z6\") on node \"crc\" DevicePath \"\"" Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.719695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" event={"ID":"da44cd5a-ceed-4771-a179-73a84ceefd45","Type":"ContainerDied","Data":"0ddf8b79feb5ad903eb4440f50d69f067aaad48e664ae8371da15d3bfcb2e57b"} Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.720470 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddf8b79feb5ad903eb4440f50d69f067aaad48e664ae8371da15d3bfcb2e57b" Feb 25 08:08:04 crc kubenswrapper[4749]: I0225 08:08:04.719813 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533448-bbw5b" Feb 25 08:08:05 crc kubenswrapper[4749]: I0225 08:08:05.203917 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533442-8qwtn"] Feb 25 08:08:05 crc kubenswrapper[4749]: I0225 08:08:05.211775 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533442-8qwtn"] Feb 25 08:08:05 crc kubenswrapper[4749]: I0225 08:08:05.336560 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11487c4-c25a-48ba-a642-03d6d05630fc" path="/var/lib/kubelet/pods/b11487c4-c25a-48ba-a642-03d6d05630fc/volumes" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.672294 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.672886 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.672934 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.673732 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.673792 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" gracePeriod=600 Feb 25 08:08:21 crc kubenswrapper[4749]: E0225 08:08:21.799417 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.897260 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" exitCode=0 Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.897300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"} Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.897331 4749 scope.go:117] "RemoveContainer" containerID="bd3aeba354d41572ae843202571f8f61b189d8833ea6cdd1ee0d7b47c65abac7" Feb 25 08:08:21 crc kubenswrapper[4749]: I0225 08:08:21.897993 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:08:21 crc kubenswrapper[4749]: E0225 08:08:21.898285 4749 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:08:36 crc kubenswrapper[4749]: I0225 08:08:36.323098 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:08:36 crc kubenswrapper[4749]: E0225 08:08:36.325712 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:08:50 crc kubenswrapper[4749]: I0225 08:08:50.322454 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:08:50 crc kubenswrapper[4749]: E0225 08:08:50.323344 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:08:53 crc kubenswrapper[4749]: I0225 08:08:53.439980 4749 scope.go:117] "RemoveContainer" containerID="a6de9b850659488c84ca92c7c09ea9050b7377ebf7f52069ed917d0d3d3885f8" Feb 25 08:09:01 crc kubenswrapper[4749]: I0225 
08:09:01.323439 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:09:01 crc kubenswrapper[4749]: E0225 08:09:01.324654 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:09:12 crc kubenswrapper[4749]: I0225 08:09:12.322565 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:09:12 crc kubenswrapper[4749]: E0225 08:09:12.323564 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:09:27 crc kubenswrapper[4749]: I0225 08:09:27.336135 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:09:27 crc kubenswrapper[4749]: E0225 08:09:27.337551 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:09:31 crc 
kubenswrapper[4749]: I0225 08:09:31.259340 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bdlz"] Feb 25 08:09:31 crc kubenswrapper[4749]: E0225 08:09:31.260861 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da44cd5a-ceed-4771-a179-73a84ceefd45" containerName="oc" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.260894 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="da44cd5a-ceed-4771-a179-73a84ceefd45" containerName="oc" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.261384 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="da44cd5a-ceed-4771-a179-73a84ceefd45" containerName="oc" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.264873 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.280655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bdlz"] Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.363492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.363583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4qg\" (UniqueName: \"kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.363796 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.465916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.466332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.466400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4qg\" (UniqueName: \"kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.466984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.467831 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.492585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4qg\" (UniqueName: \"kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg\") pod \"community-operators-6bdlz\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:31 crc kubenswrapper[4749]: I0225 08:09:31.597913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:32 crc kubenswrapper[4749]: I0225 08:09:32.205553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bdlz"] Feb 25 08:09:32 crc kubenswrapper[4749]: I0225 08:09:32.767780 4749 generic.go:334] "Generic (PLEG): container finished" podID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerID="1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec" exitCode=0 Feb 25 08:09:32 crc kubenswrapper[4749]: I0225 08:09:32.767892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerDied","Data":"1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec"} Feb 25 08:09:32 crc kubenswrapper[4749]: I0225 08:09:32.768200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerStarted","Data":"f489c71f71b90349d17b92bd23761b7e8724123070ca0679a1c09c4fa296ad6f"} Feb 25 08:09:32 crc 
kubenswrapper[4749]: I0225 08:09:32.771115 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 08:09:33 crc kubenswrapper[4749]: I0225 08:09:33.783657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerStarted","Data":"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"} Feb 25 08:09:35 crc kubenswrapper[4749]: I0225 08:09:35.809464 4749 generic.go:334] "Generic (PLEG): container finished" podID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerID="962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb" exitCode=0 Feb 25 08:09:35 crc kubenswrapper[4749]: I0225 08:09:35.809554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerDied","Data":"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"} Feb 25 08:09:36 crc kubenswrapper[4749]: I0225 08:09:36.822555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerStarted","Data":"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"} Feb 25 08:09:36 crc kubenswrapper[4749]: I0225 08:09:36.884456 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bdlz" podStartSLOduration=2.433947237 podStartE2EDuration="5.884433018s" podCreationTimestamp="2026-02-25 08:09:31 +0000 UTC" firstStartedPulling="2026-02-25 08:09:32.770741972 +0000 UTC m=+3126.132568022" lastFinishedPulling="2026-02-25 08:09:36.221227743 +0000 UTC m=+3129.583053803" observedRunningTime="2026-02-25 08:09:36.85272307 +0000 UTC m=+3130.214549130" watchObservedRunningTime="2026-02-25 08:09:36.884433018 +0000 UTC 
m=+3130.246259048" Feb 25 08:09:41 crc kubenswrapper[4749]: I0225 08:09:41.323074 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:09:41 crc kubenswrapper[4749]: E0225 08:09:41.324394 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:09:41 crc kubenswrapper[4749]: I0225 08:09:41.598955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:41 crc kubenswrapper[4749]: I0225 08:09:41.599133 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:41 crc kubenswrapper[4749]: I0225 08:09:41.674724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:41 crc kubenswrapper[4749]: I0225 08:09:41.979831 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:42 crc kubenswrapper[4749]: I0225 08:09:42.052611 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bdlz"] Feb 25 08:09:43 crc kubenswrapper[4749]: I0225 08:09:43.914046 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bdlz" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="registry-server" containerID="cri-o://a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621" gracePeriod=2 Feb 
25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.539041 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bdlz" Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.666848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content\") pod \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.666975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq4qg\" (UniqueName: \"kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg\") pod \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.667080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities\") pod \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\" (UID: \"cbf87fbd-aa63-404b-8dd9-be73e54a8e95\") " Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.667952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities" (OuterVolumeSpecName: "utilities") pod "cbf87fbd-aa63-404b-8dd9-be73e54a8e95" (UID: "cbf87fbd-aa63-404b-8dd9-be73e54a8e95"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.668516 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.678367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg" (OuterVolumeSpecName: "kube-api-access-fq4qg") pod "cbf87fbd-aa63-404b-8dd9-be73e54a8e95" (UID: "cbf87fbd-aa63-404b-8dd9-be73e54a8e95"). InnerVolumeSpecName "kube-api-access-fq4qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.719933 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbf87fbd-aa63-404b-8dd9-be73e54a8e95" (UID: "cbf87fbd-aa63-404b-8dd9-be73e54a8e95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.770741 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.770946 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq4qg\" (UniqueName: \"kubernetes.io/projected/cbf87fbd-aa63-404b-8dd9-be73e54a8e95-kube-api-access-fq4qg\") on node \"crc\" DevicePath \"\""
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.925376 4749 generic.go:334] "Generic (PLEG): container finished" podID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerID="a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621" exitCode=0
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.925433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerDied","Data":"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"}
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.925477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bdlz" event={"ID":"cbf87fbd-aa63-404b-8dd9-be73e54a8e95","Type":"ContainerDied","Data":"f489c71f71b90349d17b92bd23761b7e8724123070ca0679a1c09c4fa296ad6f"}
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.925506 4749 scope.go:117] "RemoveContainer" containerID="a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.926477 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bdlz"
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.952624 4749 scope.go:117] "RemoveContainer" containerID="962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.957762 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bdlz"]
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.965554 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bdlz"]
Feb 25 08:09:44 crc kubenswrapper[4749]: I0225 08:09:44.979926 4749 scope.go:117] "RemoveContainer" containerID="1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.047020 4749 scope.go:117] "RemoveContainer" containerID="a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"
Feb 25 08:09:45 crc kubenswrapper[4749]: E0225 08:09:45.047549 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621\": container with ID starting with a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621 not found: ID does not exist" containerID="a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.047586 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621"} err="failed to get container status \"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621\": rpc error: code = NotFound desc = could not find container \"a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621\": container with ID starting with a5bbf0e8eaf39ebd73f84b9e8ec09564b6c0a5bec4ed696b17b158b01a248621 not found: ID does not exist"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.047676 4749 scope.go:117] "RemoveContainer" containerID="962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"
Feb 25 08:09:45 crc kubenswrapper[4749]: E0225 08:09:45.048125 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb\": container with ID starting with 962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb not found: ID does not exist" containerID="962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.048175 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb"} err="failed to get container status \"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb\": rpc error: code = NotFound desc = could not find container \"962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb\": container with ID starting with 962c52fe1542babfefae4650a563481eb6426e046d72622f1449e2d0648038cb not found: ID does not exist"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.048209 4749 scope.go:117] "RemoveContainer" containerID="1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec"
Feb 25 08:09:45 crc kubenswrapper[4749]: E0225 08:09:45.048561 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec\": container with ID starting with 1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec not found: ID does not exist" containerID="1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.048619 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec"} err="failed to get container status \"1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec\": rpc error: code = NotFound desc = could not find container \"1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec\": container with ID starting with 1a7ec0f4cd55ee486ffc191bbc4bf8c3a6752f9829dc54ebdb415d6a617899ec not found: ID does not exist"
Feb 25 08:09:45 crc kubenswrapper[4749]: I0225 08:09:45.337528 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" path="/var/lib/kubelet/pods/cbf87fbd-aa63-404b-8dd9-be73e54a8e95/volumes"
Feb 25 08:09:56 crc kubenswrapper[4749]: I0225 08:09:56.322452 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:09:56 crc kubenswrapper[4749]: E0225 08:09:56.323334 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.164461 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533450-df5b6"]
Feb 25 08:10:00 crc kubenswrapper[4749]: E0225 08:10:00.166820 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="extract-utilities"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.166850 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="extract-utilities"
Feb 25 08:10:00 crc kubenswrapper[4749]: E0225 08:10:00.166889 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="extract-content"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.166899 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="extract-content"
Feb 25 08:10:00 crc kubenswrapper[4749]: E0225 08:10:00.166920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="registry-server"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.166929 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="registry-server"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.167455 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf87fbd-aa63-404b-8dd9-be73e54a8e95" containerName="registry-server"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.169160 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.173085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.173377 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.180236 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.183087 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533450-df5b6"]
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.284622 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qqz\" (UniqueName: \"kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz\") pod \"auto-csr-approver-29533450-df5b6\" (UID: \"e96b5f53-402f-41ab-842c-6fb3c75c88b1\") " pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.387502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qqz\" (UniqueName: \"kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz\") pod \"auto-csr-approver-29533450-df5b6\" (UID: \"e96b5f53-402f-41ab-842c-6fb3c75c88b1\") " pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.412785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qqz\" (UniqueName: \"kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz\") pod \"auto-csr-approver-29533450-df5b6\" (UID: \"e96b5f53-402f-41ab-842c-6fb3c75c88b1\") " pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.506775 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:00 crc kubenswrapper[4749]: I0225 08:10:00.978055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533450-df5b6"]
Feb 25 08:10:01 crc kubenswrapper[4749]: I0225 08:10:01.087135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533450-df5b6" event={"ID":"e96b5f53-402f-41ab-842c-6fb3c75c88b1","Type":"ContainerStarted","Data":"4cd906d57893a7e9c960ac9b1f336ec4b1d539fefeed0bd5891a4b4a8d5a6ac3"}
Feb 25 08:10:03 crc kubenswrapper[4749]: I0225 08:10:03.108727 4749 generic.go:334] "Generic (PLEG): container finished" podID="e96b5f53-402f-41ab-842c-6fb3c75c88b1" containerID="62bb5c79a0ab5592146ed35157de64dfbfb5aeb3ee3e5d9b3c7fab5d7062d4a1" exitCode=0
Feb 25 08:10:03 crc kubenswrapper[4749]: I0225 08:10:03.108809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533450-df5b6" event={"ID":"e96b5f53-402f-41ab-842c-6fb3c75c88b1","Type":"ContainerDied","Data":"62bb5c79a0ab5592146ed35157de64dfbfb5aeb3ee3e5d9b3c7fab5d7062d4a1"}
Feb 25 08:10:04 crc kubenswrapper[4749]: I0225 08:10:04.569362 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:04 crc kubenswrapper[4749]: I0225 08:10:04.678001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8qqz\" (UniqueName: \"kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz\") pod \"e96b5f53-402f-41ab-842c-6fb3c75c88b1\" (UID: \"e96b5f53-402f-41ab-842c-6fb3c75c88b1\") "
Feb 25 08:10:04 crc kubenswrapper[4749]: I0225 08:10:04.706801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz" (OuterVolumeSpecName: "kube-api-access-r8qqz") pod "e96b5f53-402f-41ab-842c-6fb3c75c88b1" (UID: "e96b5f53-402f-41ab-842c-6fb3c75c88b1"). InnerVolumeSpecName "kube-api-access-r8qqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 08:10:04 crc kubenswrapper[4749]: I0225 08:10:04.781868 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8qqz\" (UniqueName: \"kubernetes.io/projected/e96b5f53-402f-41ab-842c-6fb3c75c88b1-kube-api-access-r8qqz\") on node \"crc\" DevicePath \"\""
Feb 25 08:10:05 crc kubenswrapper[4749]: I0225 08:10:05.133494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533450-df5b6" event={"ID":"e96b5f53-402f-41ab-842c-6fb3c75c88b1","Type":"ContainerDied","Data":"4cd906d57893a7e9c960ac9b1f336ec4b1d539fefeed0bd5891a4b4a8d5a6ac3"}
Feb 25 08:10:05 crc kubenswrapper[4749]: I0225 08:10:05.133543 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd906d57893a7e9c960ac9b1f336ec4b1d539fefeed0bd5891a4b4a8d5a6ac3"
Feb 25 08:10:05 crc kubenswrapper[4749]: I0225 08:10:05.133575 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533450-df5b6"
Feb 25 08:10:05 crc kubenswrapper[4749]: I0225 08:10:05.694009 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533444-ptnz2"]
Feb 25 08:10:05 crc kubenswrapper[4749]: I0225 08:10:05.703111 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533444-ptnz2"]
Feb 25 08:10:07 crc kubenswrapper[4749]: I0225 08:10:07.344369 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81152f8c-3d00-497e-a709-71740dbc092c" path="/var/lib/kubelet/pods/81152f8c-3d00-497e-a709-71740dbc092c/volumes"
Feb 25 08:10:10 crc kubenswrapper[4749]: I0225 08:10:10.322541 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:10:10 crc kubenswrapper[4749]: E0225 08:10:10.323527 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:10:23 crc kubenswrapper[4749]: I0225 08:10:23.323153 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:10:23 crc kubenswrapper[4749]: E0225 08:10:23.324265 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:10:35 crc kubenswrapper[4749]: I0225 08:10:35.322297 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:10:35 crc kubenswrapper[4749]: E0225 08:10:35.323026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:10:49 crc kubenswrapper[4749]: I0225 08:10:49.323121 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:10:49 crc kubenswrapper[4749]: E0225 08:10:49.323928 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:10:53 crc kubenswrapper[4749]: I0225 08:10:53.575149 4749 scope.go:117] "RemoveContainer" containerID="4692b55371dbc5eb55e665e454bafe96f70100db05720deeeafc7fe996f603a3"
Feb 25 08:11:02 crc kubenswrapper[4749]: I0225 08:11:02.322261 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:11:02 crc kubenswrapper[4749]: E0225 08:11:02.323205 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:11:15 crc kubenswrapper[4749]: I0225 08:11:15.323565 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:11:15 crc kubenswrapper[4749]: E0225 08:11:15.324734 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:11:30 crc kubenswrapper[4749]: I0225 08:11:30.322771 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:11:30 crc kubenswrapper[4749]: E0225 08:11:30.323762 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:11:41 crc kubenswrapper[4749]: I0225 08:11:41.322572 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:11:41 crc kubenswrapper[4749]: E0225 08:11:41.323999 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:11:53 crc kubenswrapper[4749]: I0225 08:11:53.323751 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:11:53 crc kubenswrapper[4749]: E0225 08:11:53.324867 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.175053 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533452-4nb79"]
Feb 25 08:12:00 crc kubenswrapper[4749]: E0225 08:12:00.175845 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96b5f53-402f-41ab-842c-6fb3c75c88b1" containerName="oc"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.175856 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96b5f53-402f-41ab-842c-6fb3c75c88b1" containerName="oc"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.176014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96b5f53-402f-41ab-842c-6fb3c75c88b1" containerName="oc"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.176583 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.180206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.181615 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.185553 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.186736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533452-4nb79"]
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.320003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8dm\" (UniqueName: \"kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm\") pod \"auto-csr-approver-29533452-4nb79\" (UID: \"267d7778-eb07-471d-a5f2-c9dc773b6e0f\") " pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.422254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8dm\" (UniqueName: \"kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm\") pod \"auto-csr-approver-29533452-4nb79\" (UID: \"267d7778-eb07-471d-a5f2-c9dc773b6e0f\") " pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.440280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8dm\" (UniqueName: \"kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm\") pod \"auto-csr-approver-29533452-4nb79\" (UID: \"267d7778-eb07-471d-a5f2-c9dc773b6e0f\") " pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.491625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:00 crc kubenswrapper[4749]: I0225 08:12:00.980446 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533452-4nb79"]
Feb 25 08:12:01 crc kubenswrapper[4749]: I0225 08:12:01.361860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533452-4nb79" event={"ID":"267d7778-eb07-471d-a5f2-c9dc773b6e0f","Type":"ContainerStarted","Data":"8546800b52bfc74517aa5188b0fb1cbb886acb0e13121b4116d1fb7ee024917a"}
Feb 25 08:12:02 crc kubenswrapper[4749]: I0225 08:12:02.373837 4749 generic.go:334] "Generic (PLEG): container finished" podID="267d7778-eb07-471d-a5f2-c9dc773b6e0f" containerID="8701b8d76f7d5a7012cbe3480c1c6a5a4bdafbdc4b0778dbfc54114f154f179c" exitCode=0
Feb 25 08:12:02 crc kubenswrapper[4749]: I0225 08:12:02.373915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533452-4nb79" event={"ID":"267d7778-eb07-471d-a5f2-c9dc773b6e0f","Type":"ContainerDied","Data":"8701b8d76f7d5a7012cbe3480c1c6a5a4bdafbdc4b0778dbfc54114f154f179c"}
Feb 25 08:12:03 crc kubenswrapper[4749]: I0225 08:12:03.775675 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:03 crc kubenswrapper[4749]: I0225 08:12:03.882770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm8dm\" (UniqueName: \"kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm\") pod \"267d7778-eb07-471d-a5f2-c9dc773b6e0f\" (UID: \"267d7778-eb07-471d-a5f2-c9dc773b6e0f\") "
Feb 25 08:12:03 crc kubenswrapper[4749]: I0225 08:12:03.890890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm" (OuterVolumeSpecName: "kube-api-access-jm8dm") pod "267d7778-eb07-471d-a5f2-c9dc773b6e0f" (UID: "267d7778-eb07-471d-a5f2-c9dc773b6e0f"). InnerVolumeSpecName "kube-api-access-jm8dm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 08:12:03 crc kubenswrapper[4749]: I0225 08:12:03.985451 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm8dm\" (UniqueName: \"kubernetes.io/projected/267d7778-eb07-471d-a5f2-c9dc773b6e0f-kube-api-access-jm8dm\") on node \"crc\" DevicePath \"\""
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.322639 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:12:04 crc kubenswrapper[4749]: E0225 08:12:04.322957 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.390124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533452-4nb79" event={"ID":"267d7778-eb07-471d-a5f2-c9dc773b6e0f","Type":"ContainerDied","Data":"8546800b52bfc74517aa5188b0fb1cbb886acb0e13121b4116d1fb7ee024917a"}
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.390176 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8546800b52bfc74517aa5188b0fb1cbb886acb0e13121b4116d1fb7ee024917a"
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.390245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533452-4nb79"
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.867651 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533446-zhnq7"]
Feb 25 08:12:04 crc kubenswrapper[4749]: I0225 08:12:04.879935 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533446-zhnq7"]
Feb 25 08:12:05 crc kubenswrapper[4749]: I0225 08:12:05.336112 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1325fdca-732c-4117-a33f-4e5780feb63c" path="/var/lib/kubelet/pods/1325fdca-732c-4117-a33f-4e5780feb63c/volumes"
Feb 25 08:12:15 crc kubenswrapper[4749]: I0225 08:12:15.323260 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:12:15 crc kubenswrapper[4749]: E0225 08:12:15.324871 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:12:30 crc kubenswrapper[4749]: I0225 08:12:30.322815 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:12:30 crc kubenswrapper[4749]: E0225 08:12:30.323683 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:12:41 crc kubenswrapper[4749]: I0225 08:12:41.323040 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:12:41 crc kubenswrapper[4749]: E0225 08:12:41.323832 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:12:53 crc kubenswrapper[4749]: I0225 08:12:53.738116 4749 scope.go:117] "RemoveContainer" containerID="98b76538865db82f56e5d9a4da2f4451c5ef8c4c7a8c5c046672133c4fb8ccdd"
Feb 25 08:12:56 crc kubenswrapper[4749]: I0225 08:12:56.322678 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:12:56 crc kubenswrapper[4749]: E0225 08:12:56.323534 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:13:11 crc kubenswrapper[4749]: I0225 08:13:11.322115 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:13:11 crc kubenswrapper[4749]: E0225 08:13:11.322902 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:13:26 crc kubenswrapper[4749]: I0225 08:13:26.322321 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9"
Feb 25 08:13:27 crc kubenswrapper[4749]: I0225 08:13:27.518678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18"}
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.174814 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533454-rnssz"]
Feb 25 08:14:00 crc kubenswrapper[4749]: E0225 08:14:00.176095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267d7778-eb07-471d-a5f2-c9dc773b6e0f" containerName="oc"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.176123 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="267d7778-eb07-471d-a5f2-c9dc773b6e0f" containerName="oc"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.176645 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="267d7778-eb07-471d-a5f2-c9dc773b6e0f" containerName="oc"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.177756 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533454-rnssz"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.181003 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.181139 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.181222 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.191641 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533454-rnssz"]
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.295502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv64q\" (UniqueName: \"kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q\") pod \"auto-csr-approver-29533454-rnssz\" (UID: \"4e37364d-70a4-4d07-8fb8-18168563fc1e\") " pod="openshift-infra/auto-csr-approver-29533454-rnssz"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.400364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv64q\" (UniqueName: \"kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q\") pod \"auto-csr-approver-29533454-rnssz\" (UID: \"4e37364d-70a4-4d07-8fb8-18168563fc1e\") " pod="openshift-infra/auto-csr-approver-29533454-rnssz"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.424230 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv64q\" (UniqueName: \"kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q\") pod \"auto-csr-approver-29533454-rnssz\" (UID: \"4e37364d-70a4-4d07-8fb8-18168563fc1e\") " pod="openshift-infra/auto-csr-approver-29533454-rnssz"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.498647 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533454-rnssz"
Feb 25 08:14:00 crc kubenswrapper[4749]: I0225 08:14:00.982800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533454-rnssz"]
Feb 25 08:14:01 crc kubenswrapper[4749]: I0225 08:14:01.287678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533454-rnssz" event={"ID":"4e37364d-70a4-4d07-8fb8-18168563fc1e","Type":"ContainerStarted","Data":"ddbe31e6ec304a376cbacd2bcfa727009dddc4e5710a82f58b3daf972d66abd3"}
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.298482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533454-rnssz" event={"ID":"4e37364d-70a4-4d07-8fb8-18168563fc1e","Type":"ContainerStarted","Data":"461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1"}
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.311953 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29533454-rnssz" podStartSLOduration=1.39924447 podStartE2EDuration="2.311932044s" podCreationTimestamp="2026-02-25 08:14:00 +0000 UTC" firstStartedPulling="2026-02-25 08:14:00.988542087 +0000 UTC m=+3394.350368107" lastFinishedPulling="2026-02-25 08:14:01.901229641 +0000 UTC m=+3395.263055681" observedRunningTime="2026-02-25 08:14:02.310323105 +0000 UTC m=+3395.672149135" watchObservedRunningTime="2026-02-25 08:14:02.311932044 +0000 UTC m=+3395.673758064"
Feb 25 08:14:02 crc kubenswrapper[4749]: E0225 08:14:02.752665 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e37364d_70a4_4d07_8fb8_18168563fc1e.slice/crio-461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e37364d_70a4_4d07_8fb8_18168563fc1e.slice/crio-conmon-461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1.scope\": RecentStats: unable to find data in memory cache]"
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.798127 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"]
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.803913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6l2f"
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.806434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"]
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.969314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f"
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.969385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2pjv\" (UniqueName: \"kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f"
Feb 25 08:14:02 crc kubenswrapper[4749]: I0225 08:14:02.969606 
4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.071121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.071446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2pjv\" (UniqueName: \"kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.071511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.071629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.071879 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.105298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2pjv\" (UniqueName: \"kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv\") pod \"certified-operators-f6l2f\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.130249 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.346268 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e37364d-70a4-4d07-8fb8-18168563fc1e" containerID="461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1" exitCode=0 Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.346317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533454-rnssz" event={"ID":"4e37364d-70a4-4d07-8fb8-18168563fc1e","Type":"ContainerDied","Data":"461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1"} Feb 25 08:14:03 crc kubenswrapper[4749]: I0225 08:14:03.702437 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"] Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.359330 4749 generic.go:334] "Generic (PLEG): container finished" podID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerID="b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2" exitCode=0 Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.359400 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerDied","Data":"b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2"} Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.359790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerStarted","Data":"49c621214a6b251a6226afcaed993444e82c8cad8ae478677bf8ceb332048b05"} Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.843324 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533454-rnssz" Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.915506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv64q\" (UniqueName: \"kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q\") pod \"4e37364d-70a4-4d07-8fb8-18168563fc1e\" (UID: \"4e37364d-70a4-4d07-8fb8-18168563fc1e\") " Feb 25 08:14:04 crc kubenswrapper[4749]: I0225 08:14:04.922916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q" (OuterVolumeSpecName: "kube-api-access-hv64q") pod "4e37364d-70a4-4d07-8fb8-18168563fc1e" (UID: "4e37364d-70a4-4d07-8fb8-18168563fc1e"). InnerVolumeSpecName "kube-api-access-hv64q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.017899 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv64q\" (UniqueName: \"kubernetes.io/projected/4e37364d-70a4-4d07-8fb8-18168563fc1e-kube-api-access-hv64q\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.375663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerStarted","Data":"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad"} Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.378316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533454-rnssz" event={"ID":"4e37364d-70a4-4d07-8fb8-18168563fc1e","Type":"ContainerDied","Data":"ddbe31e6ec304a376cbacd2bcfa727009dddc4e5710a82f58b3daf972d66abd3"} Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.378354 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbe31e6ec304a376cbacd2bcfa727009dddc4e5710a82f58b3daf972d66abd3" Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.378406 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533454-rnssz" Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.395568 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533448-bbw5b"] Feb 25 08:14:05 crc kubenswrapper[4749]: I0225 08:14:05.406579 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533448-bbw5b"] Feb 25 08:14:06 crc kubenswrapper[4749]: I0225 08:14:06.394103 4749 generic.go:334] "Generic (PLEG): container finished" podID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerID="c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad" exitCode=0 Feb 25 08:14:06 crc kubenswrapper[4749]: I0225 08:14:06.394145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerDied","Data":"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad"} Feb 25 08:14:07 crc kubenswrapper[4749]: I0225 08:14:07.341520 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da44cd5a-ceed-4771-a179-73a84ceefd45" path="/var/lib/kubelet/pods/da44cd5a-ceed-4771-a179-73a84ceefd45/volumes" Feb 25 08:14:07 crc kubenswrapper[4749]: I0225 08:14:07.406464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerStarted","Data":"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31"} Feb 25 08:14:07 crc kubenswrapper[4749]: I0225 08:14:07.439587 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f6l2f" podStartSLOduration=2.999882764 podStartE2EDuration="5.439569336s" podCreationTimestamp="2026-02-25 08:14:02 +0000 UTC" firstStartedPulling="2026-02-25 08:14:04.361241494 +0000 UTC m=+3397.723067514" 
lastFinishedPulling="2026-02-25 08:14:06.800928066 +0000 UTC m=+3400.162754086" observedRunningTime="2026-02-25 08:14:07.429891372 +0000 UTC m=+3400.791717402" watchObservedRunningTime="2026-02-25 08:14:07.439569336 +0000 UTC m=+3400.801395356" Feb 25 08:14:13 crc kubenswrapper[4749]: I0225 08:14:13.130908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:13 crc kubenswrapper[4749]: I0225 08:14:13.131378 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:14 crc kubenswrapper[4749]: I0225 08:14:14.194138 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-f6l2f" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="registry-server" probeResult="failure" output=< Feb 25 08:14:14 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 08:14:14 crc kubenswrapper[4749]: > Feb 25 08:14:23 crc kubenswrapper[4749]: I0225 08:14:23.198191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:23 crc kubenswrapper[4749]: I0225 08:14:23.263031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:23 crc kubenswrapper[4749]: I0225 08:14:23.448030 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"] Feb 25 08:14:24 crc kubenswrapper[4749]: I0225 08:14:24.593610 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ffbba01-9e0c-4754-a378-68eaba4c858e" containerID="3c695c1d346e1e87ef6ef8437acb69045ea5e0aa45604ee313f38a60d669b127" exitCode=0 Feb 25 08:14:24 crc kubenswrapper[4749]: I0225 08:14:24.593645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"6ffbba01-9e0c-4754-a378-68eaba4c858e","Type":"ContainerDied","Data":"3c695c1d346e1e87ef6ef8437acb69045ea5e0aa45604ee313f38a60d669b127"} Feb 25 08:14:24 crc kubenswrapper[4749]: I0225 08:14:24.594128 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f6l2f" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="registry-server" containerID="cri-o://00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31" gracePeriod=2 Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.083828 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.136232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content\") pod \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.136380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities\") pod \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.136726 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2pjv\" (UniqueName: \"kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv\") pod \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\" (UID: \"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf\") " Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.137321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities" (OuterVolumeSpecName: "utilities") pod "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" (UID: "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.142896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv" (OuterVolumeSpecName: "kube-api-access-b2pjv") pod "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" (UID: "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf"). InnerVolumeSpecName "kube-api-access-b2pjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.185898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" (UID: "61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.239626 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2pjv\" (UniqueName: \"kubernetes.io/projected/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-kube-api-access-b2pjv\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.239692 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.239705 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.605434 4749 generic.go:334] "Generic (PLEG): container finished" podID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerID="00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31" exitCode=0 Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.605523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerDied","Data":"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31"} Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.605657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f6l2f" event={"ID":"61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf","Type":"ContainerDied","Data":"49c621214a6b251a6226afcaed993444e82c8cad8ae478677bf8ceb332048b05"} Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.605534 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f6l2f" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.605689 4749 scope.go:117] "RemoveContainer" containerID="00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.634696 4749 scope.go:117] "RemoveContainer" containerID="c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.637400 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"] Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.649368 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f6l2f"] Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.669459 4749 scope.go:117] "RemoveContainer" containerID="b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.726586 4749 scope.go:117] "RemoveContainer" containerID="00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31" Feb 25 08:14:25 crc kubenswrapper[4749]: E0225 08:14:25.728212 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31\": container with ID starting with 00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31 not found: ID does not exist" containerID="00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.728253 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31"} err="failed to get container status \"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31\": rpc error: code = NotFound desc = could not find 
container \"00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31\": container with ID starting with 00a5ad0828c7cc01dd3647446945166720d291735df01fc36b9b298343abfc31 not found: ID does not exist" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.728280 4749 scope.go:117] "RemoveContainer" containerID="c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad" Feb 25 08:14:25 crc kubenswrapper[4749]: E0225 08:14:25.728818 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad\": container with ID starting with c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad not found: ID does not exist" containerID="c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.728860 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad"} err="failed to get container status \"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad\": rpc error: code = NotFound desc = could not find container \"c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad\": container with ID starting with c89ca3a4596cc977f048bd6c43b67e95c777dab72d8d338b31e0b0a0b66a77ad not found: ID does not exist" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.728893 4749 scope.go:117] "RemoveContainer" containerID="b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2" Feb 25 08:14:25 crc kubenswrapper[4749]: E0225 08:14:25.729461 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2\": container with ID starting with b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2 not found: ID does 
not exist" containerID="b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2" Feb 25 08:14:25 crc kubenswrapper[4749]: I0225 08:14:25.729507 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2"} err="failed to get container status \"b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2\": rpc error: code = NotFound desc = could not find container \"b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2\": container with ID starting with b5b57a529ee591f4deb1d41dd22f8f476d69ed1825d8cf0270c163dc741332e2 not found: ID does not exist" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.040621 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.156956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157010 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 
08:14:26.157134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42l8h\" (UniqueName: \"kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.157984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.158165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data" (OuterVolumeSpecName: "config-data") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.158281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.158302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key\") pod \"6ffbba01-9e0c-4754-a378-68eaba4c858e\" (UID: \"6ffbba01-9e0c-4754-a378-68eaba4c858e\") " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.159058 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.159071 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-config-data\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.162544 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" 
(UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.169314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.169969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h" (OuterVolumeSpecName: "kube-api-access-42l8h") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "kube-api-access-42l8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.187267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.209241 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.209640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.211235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6ffbba01-9e0c-4754-a378-68eaba4c858e" (UID: "6ffbba01-9e0c-4754-a378-68eaba4c858e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261238 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261297 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ffbba01-9e0c-4754-a378-68eaba4c858e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261318 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42l8h\" (UniqueName: \"kubernetes.io/projected/6ffbba01-9e0c-4754-a378-68eaba4c858e-kube-api-access-42l8h\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261339 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261389 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261408 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.261428 4749 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ffbba01-9e0c-4754-a378-68eaba4c858e-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.280074 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.364782 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.623296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ffbba01-9e0c-4754-a378-68eaba4c858e","Type":"ContainerDied","Data":"c5fbae8c632b91847dffe386009e2424fa51cdd582e6a9d232a64089f59a43e7"} Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.623347 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 25 08:14:26 crc kubenswrapper[4749]: I0225 08:14:26.623348 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5fbae8c632b91847dffe386009e2424fa51cdd582e6a9d232a64089f59a43e7" Feb 25 08:14:27 crc kubenswrapper[4749]: I0225 08:14:27.341067 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" path="/var/lib/kubelet/pods/61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf/volumes" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.101534 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 08:14:35 crc kubenswrapper[4749]: E0225 08:14:35.103125 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="registry-server" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="registry-server" Feb 25 08:14:35 crc kubenswrapper[4749]: E0225 08:14:35.103198 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="extract-content" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103216 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="extract-content" Feb 25 08:14:35 crc kubenswrapper[4749]: E0225 08:14:35.103251 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="extract-utilities" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103271 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="extract-utilities" Feb 25 08:14:35 crc kubenswrapper[4749]: E0225 08:14:35.103297 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4e37364d-70a4-4d07-8fb8-18168563fc1e" containerName="oc" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103313 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e37364d-70a4-4d07-8fb8-18168563fc1e" containerName="oc" Feb 25 08:14:35 crc kubenswrapper[4749]: E0225 08:14:35.103357 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffbba01-9e0c-4754-a378-68eaba4c858e" containerName="tempest-tests-tempest-tests-runner" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103374 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffbba01-9e0c-4754-a378-68eaba4c858e" containerName="tempest-tests-tempest-tests-runner" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103871 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f6df9f-cfa6-4c5d-8a74-bd3c96e39eaf" containerName="registry-server" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103909 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e37364d-70a4-4d07-8fb8-18168563fc1e" containerName="oc" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.103934 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffbba01-9e0c-4754-a378-68eaba4c858e" containerName="tempest-tests-tempest-tests-runner" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.105951 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.108659 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sdgfx" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.121009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.168635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.168972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbqjr\" (UniqueName: \"kubernetes.io/projected/262268cf-ce8f-4780-9acc-a642f473b902-kube-api-access-rbqjr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.270627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbqjr\" (UniqueName: \"kubernetes.io/projected/262268cf-ce8f-4780-9acc-a642f473b902-kube-api-access-rbqjr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.270881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.271552 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.291094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbqjr\" (UniqueName: \"kubernetes.io/projected/262268cf-ce8f-4780-9acc-a642f473b902-kube-api-access-rbqjr\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.310137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"262268cf-ce8f-4780-9acc-a642f473b902\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:35 crc kubenswrapper[4749]: I0225 08:14:35.441625 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 25 08:14:36 crc kubenswrapper[4749]: I0225 08:14:36.214144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 25 08:14:36 crc kubenswrapper[4749]: I0225 08:14:36.222715 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 08:14:36 crc kubenswrapper[4749]: I0225 08:14:36.744195 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"262268cf-ce8f-4780-9acc-a642f473b902","Type":"ContainerStarted","Data":"d80f0b0504a8db2ed02a7a45f35f70d010863cd1024335ad7beef8822589cc48"} Feb 25 08:14:37 crc kubenswrapper[4749]: I0225 08:14:37.758147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"262268cf-ce8f-4780-9acc-a642f473b902","Type":"ContainerStarted","Data":"6f363cd63cb12c01005197c82c501b9e7c7aac8e135a725c1a2d097ba69a1ac6"} Feb 25 08:14:37 crc kubenswrapper[4749]: I0225 08:14:37.791910 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.865004962 podStartE2EDuration="2.791889431s" podCreationTimestamp="2026-02-25 08:14:35 +0000 UTC" firstStartedPulling="2026-02-25 08:14:36.222290503 +0000 UTC m=+3429.584116563" lastFinishedPulling="2026-02-25 08:14:37.149175002 +0000 UTC m=+3430.511001032" observedRunningTime="2026-02-25 08:14:37.779370498 +0000 UTC m=+3431.141196548" watchObservedRunningTime="2026-02-25 08:14:37.791889431 +0000 UTC m=+3431.153715461" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.675126 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 
08:14:43.678217 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.690196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.776760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.776845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.776906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.879209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 
08:14:43.879448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.879630 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.880121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.880244 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:43 crc kubenswrapper[4749]: I0225 08:14:43.907561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts\") pod \"redhat-operators-xmwc2\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:44 crc kubenswrapper[4749]: I0225 08:14:44.030621 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:44 crc kubenswrapper[4749]: I0225 08:14:44.514736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:14:44 crc kubenswrapper[4749]: I0225 08:14:44.866295 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerID="f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941" exitCode=0 Feb 25 08:14:44 crc kubenswrapper[4749]: I0225 08:14:44.866490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerDied","Data":"f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941"} Feb 25 08:14:44 crc kubenswrapper[4749]: I0225 08:14:44.866951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerStarted","Data":"c131796cbdfa9bf26dc18128edc81bd17496a9ba693ac328688153a8767257af"} Feb 25 08:14:46 crc kubenswrapper[4749]: I0225 08:14:46.889651 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerID="33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0" exitCode=0 Feb 25 08:14:46 crc kubenswrapper[4749]: I0225 08:14:46.889742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerDied","Data":"33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0"} Feb 25 08:14:47 crc kubenswrapper[4749]: I0225 08:14:47.903237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" 
event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerStarted","Data":"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a"} Feb 25 08:14:47 crc kubenswrapper[4749]: I0225 08:14:47.930213 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xmwc2" podStartSLOduration=2.393951996 podStartE2EDuration="4.930195334s" podCreationTimestamp="2026-02-25 08:14:43 +0000 UTC" firstStartedPulling="2026-02-25 08:14:44.870542435 +0000 UTC m=+3438.232368455" lastFinishedPulling="2026-02-25 08:14:47.406785773 +0000 UTC m=+3440.768611793" observedRunningTime="2026-02-25 08:14:47.926121155 +0000 UTC m=+3441.287947195" watchObservedRunningTime="2026-02-25 08:14:47.930195334 +0000 UTC m=+3441.292021354" Feb 25 08:14:53 crc kubenswrapper[4749]: I0225 08:14:53.847756 4749 scope.go:117] "RemoveContainer" containerID="566d2318ac32b3c311bfc20cbee3fa9492b2d2b293150d02e5efe161415b7d7c" Feb 25 08:14:54 crc kubenswrapper[4749]: I0225 08:14:54.038052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:54 crc kubenswrapper[4749]: I0225 08:14:54.038446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:14:55 crc kubenswrapper[4749]: I0225 08:14:55.096123 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xmwc2" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="registry-server" probeResult="failure" output=< Feb 25 08:14:55 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 08:14:55 crc kubenswrapper[4749]: > Feb 25 08:14:57 crc kubenswrapper[4749]: I0225 08:14:57.986709 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5mdj/must-gather-tq4w6"] Feb 25 08:14:57 crc kubenswrapper[4749]: I0225 08:14:57.988609 4749 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:57 crc kubenswrapper[4749]: I0225 08:14:57.998232 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z5mdj"/"openshift-service-ca.crt" Feb 25 08:14:57 crc kubenswrapper[4749]: I0225 08:14:57.998389 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z5mdj"/"kube-root-ca.crt" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.015635 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5mdj/must-gather-tq4w6"] Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.089680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.089827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hdwv\" (UniqueName: \"kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.191446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdwv\" (UniqueName: \"kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.191569 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.191954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.208241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdwv\" (UniqueName: \"kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv\") pod \"must-gather-tq4w6\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.332411 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:14:58 crc kubenswrapper[4749]: W0225 08:14:58.844212 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod420840d3_bf89_4f03_85b0_924e79badf6a.slice/crio-0ae45a9eef933f224a1ba479a4d16f4f4550cf25f17a878034970cd6f5d40b03 WatchSource:0}: Error finding container 0ae45a9eef933f224a1ba479a4d16f4f4550cf25f17a878034970cd6f5d40b03: Status 404 returned error can't find the container with id 0ae45a9eef933f224a1ba479a4d16f4f4550cf25f17a878034970cd6f5d40b03 Feb 25 08:14:58 crc kubenswrapper[4749]: I0225 08:14:58.846330 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z5mdj/must-gather-tq4w6"] Feb 25 08:14:59 crc kubenswrapper[4749]: I0225 08:14:59.070170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" event={"ID":"420840d3-bf89-4f03-85b0-924e79badf6a","Type":"ContainerStarted","Data":"0ae45a9eef933f224a1ba479a4d16f4f4550cf25f17a878034970cd6f5d40b03"} Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.161142 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr"] Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.162636 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.164474 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.166226 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.171551 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr"] Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.333164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96qm4\" (UniqueName: \"kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.333215 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.334017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.436943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.437062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96qm4\" (UniqueName: \"kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.437087 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.438060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.451050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.455481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96qm4\" (UniqueName: \"kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4\") pod \"collect-profiles-29533455-kt8dr\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.492435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:00 crc kubenswrapper[4749]: I0225 08:15:00.937831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr"] Feb 25 08:15:01 crc kubenswrapper[4749]: I0225 08:15:01.100467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" event={"ID":"632695ce-4647-4a31-b697-6ee6448d6649","Type":"ContainerStarted","Data":"6bb030bdc9a350bebc8d182f59f7f81996749619856bde258c85bb1da1a8e96e"} Feb 25 08:15:02 crc kubenswrapper[4749]: I0225 08:15:02.118240 4749 generic.go:334] "Generic (PLEG): container finished" podID="632695ce-4647-4a31-b697-6ee6448d6649" containerID="6e421eb9dc1264c08d0e2b05921275e9a8f8fcaa0977bc71f88639ea8f0958f2" exitCode=0 Feb 25 08:15:02 crc kubenswrapper[4749]: I0225 08:15:02.118303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" 
event={"ID":"632695ce-4647-4a31-b697-6ee6448d6649","Type":"ContainerDied","Data":"6e421eb9dc1264c08d0e2b05921275e9a8f8fcaa0977bc71f88639ea8f0958f2"} Feb 25 08:15:04 crc kubenswrapper[4749]: I0225 08:15:04.101271 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:15:04 crc kubenswrapper[4749]: I0225 08:15:04.148971 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.225398 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.225898 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xmwc2" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="registry-server" containerID="cri-o://476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a" gracePeriod=2 Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.563422 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.652701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume\") pod \"632695ce-4647-4a31-b697-6ee6448d6649\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.653198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96qm4\" (UniqueName: \"kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4\") pod \"632695ce-4647-4a31-b697-6ee6448d6649\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.653248 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume\") pod \"632695ce-4647-4a31-b697-6ee6448d6649\" (UID: \"632695ce-4647-4a31-b697-6ee6448d6649\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.654309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume" (OuterVolumeSpecName: "config-volume") pod "632695ce-4647-4a31-b697-6ee6448d6649" (UID: "632695ce-4647-4a31-b697-6ee6448d6649"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.659523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4" (OuterVolumeSpecName: "kube-api-access-96qm4") pod "632695ce-4647-4a31-b697-6ee6448d6649" (UID: "632695ce-4647-4a31-b697-6ee6448d6649"). 
InnerVolumeSpecName "kube-api-access-96qm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.660615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "632695ce-4647-4a31-b697-6ee6448d6649" (UID: "632695ce-4647-4a31-b697-6ee6448d6649"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.755171 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/632695ce-4647-4a31-b697-6ee6448d6649-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.755195 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96qm4\" (UniqueName: \"kubernetes.io/projected/632695ce-4647-4a31-b697-6ee6448d6649-kube-api-access-96qm4\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.755206 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/632695ce-4647-4a31-b697-6ee6448d6649-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.764037 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.856353 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities\") pod \"4e95229b-b2d8-458b-a88d-6d6768d29d36\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.856641 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts\") pod \"4e95229b-b2d8-458b-a88d-6d6768d29d36\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.856946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content\") pod \"4e95229b-b2d8-458b-a88d-6d6768d29d36\" (UID: \"4e95229b-b2d8-458b-a88d-6d6768d29d36\") " Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.857107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities" (OuterVolumeSpecName: "utilities") pod "4e95229b-b2d8-458b-a88d-6d6768d29d36" (UID: "4e95229b-b2d8-458b-a88d-6d6768d29d36"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.857612 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.860776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts" (OuterVolumeSpecName: "kube-api-access-ltsts") pod "4e95229b-b2d8-458b-a88d-6d6768d29d36" (UID: "4e95229b-b2d8-458b-a88d-6d6768d29d36"). InnerVolumeSpecName "kube-api-access-ltsts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.961223 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltsts\" (UniqueName: \"kubernetes.io/projected/4e95229b-b2d8-458b-a88d-6d6768d29d36-kube-api-access-ltsts\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:05 crc kubenswrapper[4749]: I0225 08:15:05.969453 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e95229b-b2d8-458b-a88d-6d6768d29d36" (UID: "4e95229b-b2d8-458b-a88d-6d6768d29d36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.062461 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e95229b-b2d8-458b-a88d-6d6768d29d36-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.174307 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerID="476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a" exitCode=0 Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.174472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerDied","Data":"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a"} Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.174506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xmwc2" event={"ID":"4e95229b-b2d8-458b-a88d-6d6768d29d36","Type":"ContainerDied","Data":"c131796cbdfa9bf26dc18128edc81bd17496a9ba693ac328688153a8767257af"} Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.174534 4749 scope.go:117] "RemoveContainer" containerID="476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.174785 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xmwc2" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.181900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" event={"ID":"420840d3-bf89-4f03-85b0-924e79badf6a","Type":"ContainerStarted","Data":"07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296"} Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.181945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" event={"ID":"420840d3-bf89-4f03-85b0-924e79badf6a","Type":"ContainerStarted","Data":"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153"} Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.187942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" event={"ID":"632695ce-4647-4a31-b697-6ee6448d6649","Type":"ContainerDied","Data":"6bb030bdc9a350bebc8d182f59f7f81996749619856bde258c85bb1da1a8e96e"} Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.188043 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb030bdc9a350bebc8d182f59f7f81996749619856bde258c85bb1da1a8e96e" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.188228 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533455-kt8dr" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.202522 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" podStartSLOduration=2.5429022 podStartE2EDuration="9.202505798s" podCreationTimestamp="2026-02-25 08:14:57 +0000 UTC" firstStartedPulling="2026-02-25 08:14:58.847059865 +0000 UTC m=+3452.208885885" lastFinishedPulling="2026-02-25 08:15:05.506663463 +0000 UTC m=+3458.868489483" observedRunningTime="2026-02-25 08:15:06.201372082 +0000 UTC m=+3459.563198112" watchObservedRunningTime="2026-02-25 08:15:06.202505798 +0000 UTC m=+3459.564331818" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.207810 4749 scope.go:117] "RemoveContainer" containerID="33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.228366 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.238471 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xmwc2"] Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.249331 4749 scope.go:117] "RemoveContainer" containerID="f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.301639 4749 scope.go:117] "RemoveContainer" containerID="476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a" Feb 25 08:15:06 crc kubenswrapper[4749]: E0225 08:15:06.302423 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a\": container with ID starting with 476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a not found: ID does not exist" 
containerID="476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.302540 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a"} err="failed to get container status \"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a\": rpc error: code = NotFound desc = could not find container \"476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a\": container with ID starting with 476801e394a97884f8bce6f451567948ee436a63e69e31a8aa282bb931977f7a not found: ID does not exist" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.302669 4749 scope.go:117] "RemoveContainer" containerID="33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0" Feb 25 08:15:06 crc kubenswrapper[4749]: E0225 08:15:06.303352 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0\": container with ID starting with 33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0 not found: ID does not exist" containerID="33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.303383 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0"} err="failed to get container status \"33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0\": rpc error: code = NotFound desc = could not find container \"33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0\": container with ID starting with 33e607e635f12b637549b8c4ed0bf2e645f52ffa30d69d1311cf71e55d8a13f0 not found: ID does not exist" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.303403 4749 scope.go:117] 
"RemoveContainer" containerID="f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941" Feb 25 08:15:06 crc kubenswrapper[4749]: E0225 08:15:06.303880 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941\": container with ID starting with f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941 not found: ID does not exist" containerID="f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.303908 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941"} err="failed to get container status \"f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941\": rpc error: code = NotFound desc = could not find container \"f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941\": container with ID starting with f628d77dcac625f405bc79c2a7b6fbd1e8aa65da9fd85a180ed5c2462bd1e941 not found: ID does not exist" Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.664377 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd"] Feb 25 08:15:06 crc kubenswrapper[4749]: I0225 08:15:06.672378 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533410-5blcd"] Feb 25 08:15:07 crc kubenswrapper[4749]: I0225 08:15:07.334206 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" path="/var/lib/kubelet/pods/4e95229b-b2d8-458b-a88d-6d6768d29d36/volumes" Feb 25 08:15:07 crc kubenswrapper[4749]: I0225 08:15:07.335352 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def4ca88-4fde-45f4-a10f-b0bd66600b5e" 
path="/var/lib/kubelet/pods/def4ca88-4fde-45f4-a10f-b0bd66600b5e/volumes" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.548214 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-v9qt9"] Feb 25 08:15:09 crc kubenswrapper[4749]: E0225 08:15:09.548888 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="extract-content" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.548901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="extract-content" Feb 25 08:15:09 crc kubenswrapper[4749]: E0225 08:15:09.548909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="extract-utilities" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.548915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="extract-utilities" Feb 25 08:15:09 crc kubenswrapper[4749]: E0225 08:15:09.548925 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="registry-server" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.548932 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="registry-server" Feb 25 08:15:09 crc kubenswrapper[4749]: E0225 08:15:09.548946 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632695ce-4647-4a31-b697-6ee6448d6649" containerName="collect-profiles" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.548952 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="632695ce-4647-4a31-b697-6ee6448d6649" containerName="collect-profiles" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.549120 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4e95229b-b2d8-458b-a88d-6d6768d29d36" containerName="registry-server" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.549148 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="632695ce-4647-4a31-b697-6ee6448d6649" containerName="collect-profiles" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.549913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.553134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z5mdj"/"default-dockercfg-d7t76" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.641915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.642080 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbw77\" (UniqueName: \"kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.744038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.744152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbw77\" (UniqueName: 
\"kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.744205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.779211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbw77\" (UniqueName: \"kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77\") pod \"crc-debug-v9qt9\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: I0225 08:15:09.869153 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:09 crc kubenswrapper[4749]: W0225 08:15:09.914673 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b6500f_8def_436c_afc6_a4d0a9d566af.slice/crio-2bd9e237023d25710fa46015e3c3a2ae3eda1a50401492ece4d36a657119dfe2 WatchSource:0}: Error finding container 2bd9e237023d25710fa46015e3c3a2ae3eda1a50401492ece4d36a657119dfe2: Status 404 returned error can't find the container with id 2bd9e237023d25710fa46015e3c3a2ae3eda1a50401492ece4d36a657119dfe2 Feb 25 08:15:10 crc kubenswrapper[4749]: I0225 08:15:10.237322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" event={"ID":"85b6500f-8def-436c-afc6-a4d0a9d566af","Type":"ContainerStarted","Data":"2bd9e237023d25710fa46015e3c3a2ae3eda1a50401492ece4d36a657119dfe2"} Feb 25 08:15:21 crc kubenswrapper[4749]: I0225 08:15:21.343412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" event={"ID":"85b6500f-8def-436c-afc6-a4d0a9d566af","Type":"ContainerStarted","Data":"893190b29f7a24bb6f87ac9780ae5edc46480d1078aad18b13ed51e665cd240c"} Feb 25 08:15:51 crc kubenswrapper[4749]: I0225 08:15:51.672227 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:15:51 crc kubenswrapper[4749]: I0225 08:15:51.672827 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 25 08:15:53 crc kubenswrapper[4749]: I0225 08:15:53.952582 4749 scope.go:117] "RemoveContainer" containerID="bb727634e9aedd0d8f24eb29fe19932476e05f109970239dd6485a974a293554" Feb 25 08:15:56 crc kubenswrapper[4749]: I0225 08:15:56.693845 4749 generic.go:334] "Generic (PLEG): container finished" podID="85b6500f-8def-436c-afc6-a4d0a9d566af" containerID="893190b29f7a24bb6f87ac9780ae5edc46480d1078aad18b13ed51e665cd240c" exitCode=0 Feb 25 08:15:56 crc kubenswrapper[4749]: I0225 08:15:56.694035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" event={"ID":"85b6500f-8def-436c-afc6-a4d0a9d566af","Type":"ContainerDied","Data":"893190b29f7a24bb6f87ac9780ae5edc46480d1078aad18b13ed51e665cd240c"} Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.802386 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.841219 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-v9qt9"] Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.852310 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-v9qt9"] Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.938426 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host\") pod \"85b6500f-8def-436c-afc6-a4d0a9d566af\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.938524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host" (OuterVolumeSpecName: "host") pod "85b6500f-8def-436c-afc6-a4d0a9d566af" (UID: "85b6500f-8def-436c-afc6-a4d0a9d566af"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.939049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbw77\" (UniqueName: \"kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77\") pod \"85b6500f-8def-436c-afc6-a4d0a9d566af\" (UID: \"85b6500f-8def-436c-afc6-a4d0a9d566af\") " Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.939821 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b6500f-8def-436c-afc6-a4d0a9d566af-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:57 crc kubenswrapper[4749]: I0225 08:15:57.962877 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77" (OuterVolumeSpecName: "kube-api-access-bbw77") pod "85b6500f-8def-436c-afc6-a4d0a9d566af" (UID: "85b6500f-8def-436c-afc6-a4d0a9d566af"). InnerVolumeSpecName "kube-api-access-bbw77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:15:58 crc kubenswrapper[4749]: I0225 08:15:58.041910 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbw77\" (UniqueName: \"kubernetes.io/projected/85b6500f-8def-436c-afc6-a4d0a9d566af-kube-api-access-bbw77\") on node \"crc\" DevicePath \"\"" Feb 25 08:15:58 crc kubenswrapper[4749]: I0225 08:15:58.713393 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd9e237023d25710fa46015e3c3a2ae3eda1a50401492ece4d36a657119dfe2" Feb 25 08:15:58 crc kubenswrapper[4749]: I0225 08:15:58.713810 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-v9qt9" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.029259 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-j5kfj"] Feb 25 08:15:59 crc kubenswrapper[4749]: E0225 08:15:59.029648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b6500f-8def-436c-afc6-a4d0a9d566af" containerName="container-00" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.029664 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b6500f-8def-436c-afc6-a4d0a9d566af" containerName="container-00" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.029887 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b6500f-8def-436c-afc6-a4d0a9d566af" containerName="container-00" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.030457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.032272 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z5mdj"/"default-dockercfg-d7t76" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.164408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qntg4\" (UniqueName: \"kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.164492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " 
pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.266416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qntg4\" (UniqueName: \"kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.266525 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.266664 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.290576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qntg4\" (UniqueName: \"kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4\") pod \"crc-debug-j5kfj\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.332428 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b6500f-8def-436c-afc6-a4d0a9d566af" path="/var/lib/kubelet/pods/85b6500f-8def-436c-afc6-a4d0a9d566af/volumes" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.346944 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.723462 4749 generic.go:334] "Generic (PLEG): container finished" podID="f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" containerID="1511e1b884c4be7d125524adbde61b77361a1ad990ae94b06218537afe60974b" exitCode=0 Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.723647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" event={"ID":"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc","Type":"ContainerDied","Data":"1511e1b884c4be7d125524adbde61b77361a1ad990ae94b06218537afe60974b"} Feb 25 08:15:59 crc kubenswrapper[4749]: I0225 08:15:59.723851 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" event={"ID":"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc","Type":"ContainerStarted","Data":"f2d64a6de440294827f3de3945b37e99551b009845cbec8570372667f87fd6b8"} Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.133738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533456-qlk2b"] Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.135162 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.137281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.138354 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.138361 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.146848 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533456-qlk2b"] Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.184467 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-j5kfj"] Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.193937 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-j5kfj"] Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.282912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9q72\" (UniqueName: \"kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72\") pod \"auto-csr-approver-29533456-qlk2b\" (UID: \"02af7096-2076-47c6-81c4-57cb8e3c0c31\") " pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.385562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9q72\" (UniqueName: \"kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72\") pod \"auto-csr-approver-29533456-qlk2b\" (UID: \"02af7096-2076-47c6-81c4-57cb8e3c0c31\") " pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:00 crc 
kubenswrapper[4749]: I0225 08:16:00.413800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9q72\" (UniqueName: \"kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72\") pod \"auto-csr-approver-29533456-qlk2b\" (UID: \"02af7096-2076-47c6-81c4-57cb8e3c0c31\") " pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.464109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.845066 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.920544 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533456-qlk2b"] Feb 25 08:16:00 crc kubenswrapper[4749]: W0225 08:16:00.925338 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02af7096_2076_47c6_81c4_57cb8e3c0c31.slice/crio-885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b WatchSource:0}: Error finding container 885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b: Status 404 returned error can't find the container with id 885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.996463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host\") pod \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.996529 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qntg4\" 
(UniqueName: \"kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4\") pod \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\" (UID: \"f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc\") " Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.996585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host" (OuterVolumeSpecName: "host") pod "f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" (UID: "f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:16:00 crc kubenswrapper[4749]: I0225 08:16:00.997140 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.002527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4" (OuterVolumeSpecName: "kube-api-access-qntg4") pod "f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" (UID: "f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc"). InnerVolumeSpecName "kube-api-access-qntg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.100629 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qntg4\" (UniqueName: \"kubernetes.io/projected/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc-kube-api-access-qntg4\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.334442 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" path="/var/lib/kubelet/pods/f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc/volumes" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.353639 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-b7mjr"] Feb 25 08:16:01 crc kubenswrapper[4749]: E0225 08:16:01.353992 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" containerName="container-00" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.354008 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" containerName="container-00" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.354198 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f789a23b-563e-4ecc-9d6b-86c3bfaa6ebc" containerName="container-00" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.354767 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.507143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwhv\" (UniqueName: \"kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.508038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.611167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.611229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwhv\" (UniqueName: \"kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.611409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc 
kubenswrapper[4749]: I0225 08:16:01.633885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwhv\" (UniqueName: \"kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv\") pod \"crc-debug-b7mjr\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.680209 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:01 crc kubenswrapper[4749]: W0225 08:16:01.731411 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52d670e3_fec3_4c0a_a171_c563a8722763.slice/crio-26ae9ef4c101a9e8f01b1a9bf78c8e8b26e03200208c1ec4ede618592587b4b8 WatchSource:0}: Error finding container 26ae9ef4c101a9e8f01b1a9bf78c8e8b26e03200208c1ec4ede618592587b4b8: Status 404 returned error can't find the container with id 26ae9ef4c101a9e8f01b1a9bf78c8e8b26e03200208c1ec4ede618592587b4b8 Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.746075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" event={"ID":"02af7096-2076-47c6-81c4-57cb8e3c0c31","Type":"ContainerStarted","Data":"885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b"} Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.747872 4749 scope.go:117] "RemoveContainer" containerID="1511e1b884c4be7d125524adbde61b77361a1ad990ae94b06218537afe60974b" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.747922 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-j5kfj" Feb 25 08:16:01 crc kubenswrapper[4749]: I0225 08:16:01.749049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" event={"ID":"52d670e3-fec3-4c0a-a171-c563a8722763","Type":"ContainerStarted","Data":"26ae9ef4c101a9e8f01b1a9bf78c8e8b26e03200208c1ec4ede618592587b4b8"} Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.758573 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d670e3-fec3-4c0a-a171-c563a8722763" containerID="af9569b67b8f797e15505d16f1404948b8238fecf8d4e6e3dfa14ecf0df3575a" exitCode=0 Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.758762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" event={"ID":"52d670e3-fec3-4c0a-a171-c563a8722763","Type":"ContainerDied","Data":"af9569b67b8f797e15505d16f1404948b8238fecf8d4e6e3dfa14ecf0df3575a"} Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.760816 4749 generic.go:334] "Generic (PLEG): container finished" podID="02af7096-2076-47c6-81c4-57cb8e3c0c31" containerID="1e3a9a928ef7db64390c5e24814632685a0662fdfaf9b83d7bc68d6ad697d0d8" exitCode=0 Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.760884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" event={"ID":"02af7096-2076-47c6-81c4-57cb8e3c0c31","Type":"ContainerDied","Data":"1e3a9a928ef7db64390c5e24814632685a0662fdfaf9b83d7bc68d6ad697d0d8"} Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.799871 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-b7mjr"] Feb 25 08:16:02 crc kubenswrapper[4749]: I0225 08:16:02.814512 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5mdj/crc-debug-b7mjr"] Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.428534 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:03 crc kubenswrapper[4749]: E0225 08:16:03.429039 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d670e3-fec3-4c0a-a171-c563a8722763" containerName="container-00" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.429060 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d670e3-fec3-4c0a-a171-c563a8722763" containerName="container-00" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.429283 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d670e3-fec3-4c0a-a171-c563a8722763" containerName="container-00" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.430939 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.447093 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.551727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.551775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbqb\" (UniqueName: \"kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.551802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.658107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.658438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbqb\" (UniqueName: \"kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.658470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.658555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.659064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.676236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbqb\" (UniqueName: \"kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb\") pod \"redhat-marketplace-cmc5v\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.789916 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.895968 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.964446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host\") pod \"52d670e3-fec3-4c0a-a171-c563a8722763\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.964539 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwhv\" (UniqueName: \"kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv\") pod \"52d670e3-fec3-4c0a-a171-c563a8722763\" (UID: \"52d670e3-fec3-4c0a-a171-c563a8722763\") " Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.964629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host" (OuterVolumeSpecName: "host") pod "52d670e3-fec3-4c0a-a171-c563a8722763" (UID: 
"52d670e3-fec3-4c0a-a171-c563a8722763"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.965087 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52d670e3-fec3-4c0a-a171-c563a8722763-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:03 crc kubenswrapper[4749]: I0225 08:16:03.970689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv" (OuterVolumeSpecName: "kube-api-access-4dwhv") pod "52d670e3-fec3-4c0a-a171-c563a8722763" (UID: "52d670e3-fec3-4c0a-a171-c563a8722763"). InnerVolumeSpecName "kube-api-access-4dwhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.065669 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwhv\" (UniqueName: \"kubernetes.io/projected/52d670e3-fec3-4c0a-a171-c563a8722763-kube-api-access-4dwhv\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.109537 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:04 crc kubenswrapper[4749]: W0225 08:16:04.254811 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24c31bd_364e_4a4c_abdb_327239f018bf.slice/crio-5237376bacd7ca05f32c08b7e19403d7c357c29ede68972acc8a972df6bfe21c WatchSource:0}: Error finding container 5237376bacd7ca05f32c08b7e19403d7c357c29ede68972acc8a972df6bfe21c: Status 404 returned error can't find the container with id 5237376bacd7ca05f32c08b7e19403d7c357c29ede68972acc8a972df6bfe21c Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.256050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.268427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9q72\" (UniqueName: \"kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72\") pod \"02af7096-2076-47c6-81c4-57cb8e3c0c31\" (UID: \"02af7096-2076-47c6-81c4-57cb8e3c0c31\") " Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.271778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72" (OuterVolumeSpecName: "kube-api-access-w9q72") pod "02af7096-2076-47c6-81c4-57cb8e3c0c31" (UID: "02af7096-2076-47c6-81c4-57cb8e3c0c31"). InnerVolumeSpecName "kube-api-access-w9q72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.371442 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9q72\" (UniqueName: \"kubernetes.io/projected/02af7096-2076-47c6-81c4-57cb8e3c0c31-kube-api-access-w9q72\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.779442 4749 generic.go:334] "Generic (PLEG): container finished" podID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerID="a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e" exitCode=0 Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.779530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerDied","Data":"a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e"} Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.779556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerStarted","Data":"5237376bacd7ca05f32c08b7e19403d7c357c29ede68972acc8a972df6bfe21c"} Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.782654 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.782669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533456-qlk2b" event={"ID":"02af7096-2076-47c6-81c4-57cb8e3c0c31","Type":"ContainerDied","Data":"885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b"} Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.782730 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885c06401908a5cd40e4d29a340f74ff85f30da6727e9a3e56f583f27a97fa5b" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.788210 4749 scope.go:117] "RemoveContainer" containerID="af9569b67b8f797e15505d16f1404948b8238fecf8d4e6e3dfa14ecf0df3575a" Feb 25 08:16:04 crc kubenswrapper[4749]: I0225 08:16:04.788275 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/crc-debug-b7mjr" Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.188170 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533450-df5b6"] Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.204135 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533450-df5b6"] Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.331965 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d670e3-fec3-4c0a-a171-c563a8722763" path="/var/lib/kubelet/pods/52d670e3-fec3-4c0a-a171-c563a8722763/volumes" Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.332660 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96b5f53-402f-41ab-842c-6fb3c75c88b1" path="/var/lib/kubelet/pods/e96b5f53-402f-41ab-842c-6fb3c75c88b1/volumes" Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.799729 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerID="e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04" exitCode=0 Feb 25 08:16:05 crc kubenswrapper[4749]: I0225 08:16:05.799775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerDied","Data":"e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04"} Feb 25 08:16:05 crc kubenswrapper[4749]: E0225 08:16:05.899780 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24c31bd_364e_4a4c_abdb_327239f018bf.slice/crio-e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda24c31bd_364e_4a4c_abdb_327239f018bf.slice/crio-conmon-e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04.scope\": RecentStats: unable to find data in memory cache]" Feb 25 08:16:06 crc kubenswrapper[4749]: I0225 08:16:06.814403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerStarted","Data":"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180"} Feb 25 08:16:06 crc kubenswrapper[4749]: I0225 08:16:06.845266 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cmc5v" podStartSLOduration=2.425851294 podStartE2EDuration="3.845251546s" podCreationTimestamp="2026-02-25 08:16:03 +0000 UTC" firstStartedPulling="2026-02-25 08:16:04.781378753 +0000 UTC m=+3518.143204783" lastFinishedPulling="2026-02-25 08:16:06.200778975 +0000 UTC m=+3519.562605035" observedRunningTime="2026-02-25 08:16:06.839917337 +0000 UTC m=+3520.201743357" 
watchObservedRunningTime="2026-02-25 08:16:06.845251546 +0000 UTC m=+3520.207077566" Feb 25 08:16:13 crc kubenswrapper[4749]: I0225 08:16:13.791265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:13 crc kubenswrapper[4749]: I0225 08:16:13.791799 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:13 crc kubenswrapper[4749]: I0225 08:16:13.845097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:13 crc kubenswrapper[4749]: I0225 08:16:13.930264 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:14 crc kubenswrapper[4749]: I0225 08:16:14.088934 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:15 crc kubenswrapper[4749]: I0225 08:16:15.898588 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cmc5v" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="registry-server" containerID="cri-o://5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180" gracePeriod=2 Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.404248 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.498777 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content\") pod \"a24c31bd-364e-4a4c-abdb-327239f018bf\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.499411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities\") pod \"a24c31bd-364e-4a4c-abdb-327239f018bf\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.499485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjbqb\" (UniqueName: \"kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb\") pod \"a24c31bd-364e-4a4c-abdb-327239f018bf\" (UID: \"a24c31bd-364e-4a4c-abdb-327239f018bf\") " Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.500226 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities" (OuterVolumeSpecName: "utilities") pod "a24c31bd-364e-4a4c-abdb-327239f018bf" (UID: "a24c31bd-364e-4a4c-abdb-327239f018bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.507640 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb" (OuterVolumeSpecName: "kube-api-access-wjbqb") pod "a24c31bd-364e-4a4c-abdb-327239f018bf" (UID: "a24c31bd-364e-4a4c-abdb-327239f018bf"). InnerVolumeSpecName "kube-api-access-wjbqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.534886 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a24c31bd-364e-4a4c-abdb-327239f018bf" (UID: "a24c31bd-364e-4a4c-abdb-327239f018bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.602411 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.602461 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjbqb\" (UniqueName: \"kubernetes.io/projected/a24c31bd-364e-4a4c-abdb-327239f018bf-kube-api-access-wjbqb\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.602481 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24c31bd-364e-4a4c-abdb-327239f018bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.910730 4749 generic.go:334] "Generic (PLEG): container finished" podID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerID="5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180" exitCode=0 Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.910796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerDied","Data":"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180"} Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.910877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-cmc5v" event={"ID":"a24c31bd-364e-4a4c-abdb-327239f018bf","Type":"ContainerDied","Data":"5237376bacd7ca05f32c08b7e19403d7c357c29ede68972acc8a972df6bfe21c"} Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.910910 4749 scope.go:117] "RemoveContainer" containerID="5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.910916 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmc5v" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.957872 4749 scope.go:117] "RemoveContainer" containerID="e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04" Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.960087 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.973800 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmc5v"] Feb 25 08:16:16 crc kubenswrapper[4749]: I0225 08:16:16.994513 4749 scope.go:117] "RemoveContainer" containerID="a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.055101 4749 scope.go:117] "RemoveContainer" containerID="5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180" Feb 25 08:16:17 crc kubenswrapper[4749]: E0225 08:16:17.055835 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180\": container with ID starting with 5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180 not found: ID does not exist" containerID="5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.055908 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180"} err="failed to get container status \"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180\": rpc error: code = NotFound desc = could not find container \"5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180\": container with ID starting with 5b39b0461a2071b7f96414451b9735865f5d36cbe37fb280ec5415d89a0e6180 not found: ID does not exist" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.055950 4749 scope.go:117] "RemoveContainer" containerID="e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04" Feb 25 08:16:17 crc kubenswrapper[4749]: E0225 08:16:17.056366 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04\": container with ID starting with e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04 not found: ID does not exist" containerID="e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.056405 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04"} err="failed to get container status \"e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04\": rpc error: code = NotFound desc = could not find container \"e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04\": container with ID starting with e79f684cb9996182640e94d843e803e92cd2fc7379bf75e99d08ea35f04b8d04 not found: ID does not exist" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.056431 4749 scope.go:117] "RemoveContainer" containerID="a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e" Feb 25 08:16:17 crc kubenswrapper[4749]: E0225 
08:16:17.057002 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e\": container with ID starting with a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e not found: ID does not exist" containerID="a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.057122 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e"} err="failed to get container status \"a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e\": rpc error: code = NotFound desc = could not find container \"a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e\": container with ID starting with a048775aa28420c553bbf95cd1662b288939afe6dd5a6c87b8e52b210341378e not found: ID does not exist" Feb 25 08:16:17 crc kubenswrapper[4749]: I0225 08:16:17.340081 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" path="/var/lib/kubelet/pods/a24c31bd-364e-4a4c-abdb-327239f018bf/volumes" Feb 25 08:16:21 crc kubenswrapper[4749]: I0225 08:16:21.672146 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:16:21 crc kubenswrapper[4749]: I0225 08:16:21.672452 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 25 08:16:21 crc kubenswrapper[4749]: I0225 08:16:21.851473 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b57cbd48-btws7_35a9ddc1-5f7b-4a22-8b8f-45b895c6731c/barbican-api/0.log" Feb 25 08:16:21 crc kubenswrapper[4749]: I0225 08:16:21.957463 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b57cbd48-btws7_35a9ddc1-5f7b-4a22-8b8f-45b895c6731c/barbican-api-log/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.094225 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d5c5c5846-h56g5_004d8426-4842-4f64-ba76-6ee6afed85de/barbican-keystone-listener/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.114981 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d5c5c5846-h56g5_004d8426-4842-4f64-ba76-6ee6afed85de/barbican-keystone-listener-log/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.297878 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-df9d99775-z76dt_9111c168-e7cf-494e-a603-93b1f9db0b73/barbican-worker/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.306952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-df9d99775-z76dt_9111c168-e7cf-494e-a603-93b1f9db0b73/barbican-worker-log/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.416420 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq_42b06635-d369-4299-92f6-e912f4d811df/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.545937 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/ceilometer-central-agent/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.600564 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/ceilometer-notification-agent/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.647725 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/proxy-httpd/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.692994 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/sg-core/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.834816 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_65d2cd6c-9f4d-4d9d-9032-7798a57b7aec/cinder-api/0.log" Feb 25 08:16:22 crc kubenswrapper[4749]: I0225 08:16:22.853918 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_65d2cd6c-9f4d-4d9d-9032-7798a57b7aec/cinder-api-log/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.195379 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6ba75dd-bf4c-4d7e-88b3-cf11679c231a/probe/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.204489 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6ba75dd-bf4c-4d7e-88b3-cf11679c231a/cinder-scheduler/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.300614 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vfqng_f2f01883-686b-4aed-9458-ee14d1c3eb10/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.407615 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk_4237ec4c-49e0-4c6d-8a5c-d67583610f3d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:23 
crc kubenswrapper[4749]: I0225 08:16:23.510634 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/init/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.697091 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc_fd2d74c2-5270-4697-b4fb-47a5affbbf68/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.707692 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/dnsmasq-dns/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.711494 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/init/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.893709 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3957039-d105-44aa-865d-08cf1bd562bf/glance-log/0.log" Feb 25 08:16:23 crc kubenswrapper[4749]: I0225 08:16:23.904078 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3957039-d105-44aa-865d-08cf1bd562bf/glance-httpd/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.071779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64d42008-7546-4307-9953-37a51af1df8a/glance-log/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.086987 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64d42008-7546-4307-9953-37a51af1df8a/glance-httpd/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.222460 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-75597b5c88-58jkm_43b78d96-31fe-4729-aacf-09c66c121861/horizon/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.380910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz_7e445ab6-1f18-49fe-b3f4-0921714e4d08/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.525624 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75597b5c88-58jkm_43b78d96-31fe-4729-aacf-09c66c121861/horizon-log/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.583859 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-vjvzc_2c3eb600-4864-4229-bbfc-6b24211fc914/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.779650 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29533441-4v5xd_4b21ba16-5e25-41fd-afbb-82072bfdc006/keystone-cron/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.814910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-567ffd99f4-495rj_d2f908fa-2e8d-44bd-ac10-d745ce196bda/keystone-api/0.log" Feb 25 08:16:24 crc kubenswrapper[4749]: I0225 08:16:24.957055 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_11165cb9-7a9a-425b-8eea-42e61a784a57/kube-state-metrics/0.log" Feb 25 08:16:25 crc kubenswrapper[4749]: I0225 08:16:25.009807 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw_04b421fd-689e-4212-85a9-ffaecfe63fbe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:25 crc kubenswrapper[4749]: I0225 08:16:25.365179 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-668b654645-4xlm5_8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6/neutron-api/0.log" Feb 25 08:16:25 crc kubenswrapper[4749]: I0225 08:16:25.393847 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-668b654645-4xlm5_8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6/neutron-httpd/0.log" Feb 25 08:16:25 crc kubenswrapper[4749]: I0225 08:16:25.610668 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95_f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.031063 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1bc34300-d817-4b83-865e-2cc2c5ffb31a/nova-cell0-conductor-conductor/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.137026 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ea427259-a5cd-455d-a3c3-7031a607e42c/nova-api-log/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.229895 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ea427259-a5cd-455d-a3c3-7031a607e42c/nova-api-api/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.332195 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e800eb5f-0979-4ad3-8f0a-1adab77e1259/nova-cell1-conductor-conductor/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.415942 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.586546 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l2dl2_9b501ea6-b5f9-497b-9da6-072e7a0fde7a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.716333 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8b4782e-5a90-4774-819b-dc12f4c1b585/nova-metadata-log/0.log" Feb 25 08:16:26 crc kubenswrapper[4749]: I0225 08:16:26.946898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79/nova-scheduler-scheduler/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.035386 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/mysql-bootstrap/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.293127 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/mysql-bootstrap/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.299274 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/galera/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.474294 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/mysql-bootstrap/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.711652 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/mysql-bootstrap/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.752774 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/galera/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.794308 4749 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_b8b4782e-5a90-4774-819b-dc12f4c1b585/nova-metadata-metadata/0.log" Feb 25 08:16:27 crc kubenswrapper[4749]: I0225 08:16:27.888441 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a7c13290-8c43-443a-8563-ea54a96c975a/openstackclient/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.039191 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9jt2g_4197a74b-885d-41df-8484-05e645656b2a/ovn-controller/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.103568 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rtctt_cd7c992f-27dd-409a-a7db-34b40e2ed6eb/openstack-network-exporter/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.255855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server-init/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.442868 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovs-vswitchd/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.489725 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.493564 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server-init/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.673291 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5nv2h_4671b264-81d8-4dfb-9bb9-33a1f2c46068/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 
08:16:28.727149 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d644537f-b5d7-4f78-be98-d61b2f1d6ac3/openstack-network-exporter/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.818066 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d644537f-b5d7-4f78-be98-d61b2f1d6ac3/ovn-northd/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.949570 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8fa832b-da81-48f1-b4d6-e72688303d93/openstack-network-exporter/0.log" Feb 25 08:16:28 crc kubenswrapper[4749]: I0225 08:16:28.958486 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8fa832b-da81-48f1-b4d6-e72688303d93/ovsdbserver-nb/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.090377 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_afd2522b-bd24-455e-bc5f-a62caba2ff23/openstack-network-exporter/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.138584 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_afd2522b-bd24-455e-bc5f-a62caba2ff23/ovsdbserver-sb/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.423426 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/setup-container/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.450281 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5578bc7b56-qlg29_90ab2780-2dee-40f9-a4c6-529e08d4de0b/placement-api/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.452636 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5578bc7b56-qlg29_90ab2780-2dee-40f9-a4c6-529e08d4de0b/placement-log/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.654269 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/rabbitmq/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.695003 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/setup-container/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.710911 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/setup-container/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.893709 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/rabbitmq/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.906376 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd_daf53e51-e69a-43fd-bfa4-50ffcd4c9234/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:29 crc kubenswrapper[4749]: I0225 08:16:29.927030 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/setup-container/0.log" Feb 25 08:16:30 crc kubenswrapper[4749]: I0225 08:16:30.307449 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hx724_9653d4b7-617d-4ca2-a596-3f4ab7086b05/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:30 crc kubenswrapper[4749]: I0225 08:16:30.435813 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf_911f3de9-9115-4be2-98f8-a1e26e35387a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:30 crc kubenswrapper[4749]: I0225 08:16:30.609679 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nfvzb_11ec0661-d541-4c78-bc67-3bcb2e908694/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:30 crc kubenswrapper[4749]: I0225 08:16:30.671416 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2w792_357114b3-24d4-4f6f-ba27-99c4314d110d/ssh-known-hosts-edpm-deployment/0.log" Feb 25 08:16:30 crc kubenswrapper[4749]: I0225 08:16:30.854747 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6846d6d889-85shz_ff8a7476-3a62-4af3-a5bb-8a4b8d60108f/proxy-server/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.019808 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6846d6d889-85shz_ff8a7476-3a62-4af3-a5bb-8a4b8d60108f/proxy-httpd/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.066517 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tw69n_9d9bb0e5-da1f-480e-8a59-e00767290acc/swift-ring-rebalance/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.161997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-auditor/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.252573 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-reaper/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.365618 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-server/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.389138 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-replicator/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 
08:16:31.420340 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-auditor/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.432048 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-replicator/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.549283 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-server/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.597680 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-expirer/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.633305 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-updater/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.680833 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-auditor/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.752497 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-replicator/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.818506 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-server/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.873344 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-updater/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.940244 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/rsync/0.log" Feb 25 08:16:31 crc kubenswrapper[4749]: I0225 08:16:31.991761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/swift-recon-cron/0.log" Feb 25 08:16:32 crc kubenswrapper[4749]: I0225 08:16:32.155337 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq_18d17a7c-1c9d-47bc-818d-c2f567dfe075/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:32 crc kubenswrapper[4749]: I0225 08:16:32.266453 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6ffbba01-9e0c-4754-a378-68eaba4c858e/tempest-tests-tempest-tests-runner/0.log" Feb 25 08:16:32 crc kubenswrapper[4749]: I0225 08:16:32.423110 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_262268cf-ce8f-4780-9acc-a642f473b902/test-operator-logs-container/0.log" Feb 25 08:16:32 crc kubenswrapper[4749]: I0225 08:16:32.477084 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42z9m_e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:16:39 crc kubenswrapper[4749]: I0225 08:16:39.690066 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_df3ffcd0-1fa9-4c25-a331-33baf2a3acfd/memcached/0.log" Feb 25 08:16:51 crc kubenswrapper[4749]: I0225 08:16:51.671988 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:16:51 crc kubenswrapper[4749]: I0225 
08:16:51.672690 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:16:51 crc kubenswrapper[4749]: I0225 08:16:51.672749 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 08:16:51 crc kubenswrapper[4749]: I0225 08:16:51.673671 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 08:16:51 crc kubenswrapper[4749]: I0225 08:16:51.673758 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18" gracePeriod=600 Feb 25 08:16:52 crc kubenswrapper[4749]: I0225 08:16:52.246122 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18" exitCode=0 Feb 25 08:16:52 crc kubenswrapper[4749]: I0225 08:16:52.246342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18"} Feb 25 08:16:52 crc 
kubenswrapper[4749]: I0225 08:16:52.246755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a"} Feb 25 08:16:52 crc kubenswrapper[4749]: I0225 08:16:52.246793 4749 scope.go:117] "RemoveContainer" containerID="603a119836953c06674ae1a0f6e98e346407e8bbd2ae4cd7c23a73f2958679b9" Feb 25 08:16:54 crc kubenswrapper[4749]: I0225 08:16:54.039457 4749 scope.go:117] "RemoveContainer" containerID="62bb5c79a0ab5592146ed35157de64dfbfb5aeb3ee3e5d9b3c7fab5d7062d4a1" Feb 25 08:16:58 crc kubenswrapper[4749]: I0225 08:16:58.728557 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:16:58 crc kubenswrapper[4749]: I0225 08:16:58.898372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:16:59 crc kubenswrapper[4749]: I0225 08:16:59.012386 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:16:59 crc kubenswrapper[4749]: I0225 08:16:59.016393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:16:59 crc kubenswrapper[4749]: I0225 08:16:59.192718 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:16:59 crc 
kubenswrapper[4749]: I0225 08:16:59.245784 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/extract/0.log" Feb 25 08:16:59 crc kubenswrapper[4749]: I0225 08:16:59.261369 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:16:59 crc kubenswrapper[4749]: I0225 08:16:59.871292 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-4jdng_a8412981-280e-4153-b15e-7a5df751e110/manager/0.log" Feb 25 08:17:00 crc kubenswrapper[4749]: I0225 08:17:00.202952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-7h9cc_23e28440-3881-4d38-885e-3a20842b117d/manager/0.log" Feb 25 08:17:00 crc kubenswrapper[4749]: I0225 08:17:00.391373 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-9tv6m_dc40a3f9-e350-41dc-b13d-86ae3e46f551/manager/0.log" Feb 25 08:17:00 crc kubenswrapper[4749]: I0225 08:17:00.595067 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-2tccs_0a7bf49d-b6d9-474b-b97c-cb555aa93f8a/manager/0.log" Feb 25 08:17:00 crc kubenswrapper[4749]: I0225 08:17:00.724525 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-9jh8w_0d2964a7-7341-4f1f-ab51-b648ea057535/manager/0.log" Feb 25 08:17:00 crc kubenswrapper[4749]: I0225 08:17:00.913236 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-5nbpj_a8e8e364-0b06-4b3d-9faf-4c7c3c233060/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.166184 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-rtn6b_b1ea3312-6b21-440a-b617-5681d286bcc4/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.249173 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-zjvss_17bb447b-e6f8-4a04-98ea-8559cbd26d34/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.412840 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-b2j2v_0594f0f5-d0ce-4c43-8572-3dc16130152e/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.573918 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-g4h2l_b394d64a-2cf2-4cad-9b51-adbf56cb696c/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.809586 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-sptf9_b54d6e5b-b74d-4e25-808a-20383b1b02e0/manager/0.log" Feb 25 08:17:01 crc kubenswrapper[4749]: I0225 08:17:01.911845 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-q64h7_c881d86e-332d-419d-8c8a-9b7dfafe8c3c/manager/0.log" Feb 25 08:17:02 crc kubenswrapper[4749]: I0225 08:17:02.049612 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-2lrvt_9fcf9db9-6abb-445d-aa8c-8d5e60431838/manager/0.log" Feb 25 08:17:02 crc kubenswrapper[4749]: I0225 08:17:02.242823 4749 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c_6f55e1ee-705c-409e-b34a-77232bf089eb/manager/0.log" Feb 25 08:17:02 crc kubenswrapper[4749]: I0225 08:17:02.563255 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57b9579b7-7f7fh_3effb281-c000-4831-89a1-8b85dfc219b3/operator/0.log" Feb 25 08:17:02 crc kubenswrapper[4749]: I0225 08:17:02.756913 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tk29m_7b2662ca-97d0-4f81-b90a-c2735bd2a62a/registry-server/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.027461 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-5gt8h_9bae09cb-1d81-4971-82bc-84d87a6dca77/manager/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.072358 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-n4v8h_e1c9da55-b9bc-4bb7-b73a-ca73c928f333/manager/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.256643 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vx8bh_d3e727f4-f059-41f3-94d1-fef7a644f2b2/operator/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.381855 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-nqhfv_e1152a21-82e3-4a7f-92c8-8633abeecb26/manager/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.766384 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-wv258_6dc6924f-3556-4ac1-b787-57fa3e20297f/manager/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.827293 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-wh7zn_0ada39b5-592e-448e-9a12-2f9e95906a74/manager/0.log" Feb 25 08:17:03 crc kubenswrapper[4749]: I0225 08:17:03.996931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-7fzsq_6b0526d5-8f6f-47d4-87f5-0deb2c091848/manager/0.log" Feb 25 08:17:04 crc kubenswrapper[4749]: I0225 08:17:04.122293 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6699bbbd4-8bgxl_37a9e86e-0ee0-4447-910a-a185f4681508/manager/0.log" Feb 25 08:17:05 crc kubenswrapper[4749]: I0225 08:17:05.423906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-fn75h_12921b78-d19b-4a5c-be9e-5cf412b88186/manager/0.log" Feb 25 08:17:25 crc kubenswrapper[4749]: I0225 08:17:25.452524 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2rbvc_85efff27-fc96-4191-9733-a6a2434a723c/control-plane-machine-set-operator/0.log" Feb 25 08:17:25 crc kubenswrapper[4749]: I0225 08:17:25.608148 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mqgw7_723f3a22-e0cb-4b03-952d-7f4e6aece976/machine-api-operator/0.log" Feb 25 08:17:25 crc kubenswrapper[4749]: I0225 08:17:25.636079 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mqgw7_723f3a22-e0cb-4b03-952d-7f4e6aece976/kube-rbac-proxy/0.log" Feb 25 08:17:40 crc kubenswrapper[4749]: I0225 08:17:40.466760 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-6858f_280567a7-b82f-4767-93b7-725ad0ff927e/cert-manager-controller/0.log" Feb 25 08:17:40 crc kubenswrapper[4749]: I0225 08:17:40.628877 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rtblf_e03b662d-9431-4217-b126-b2a7db9ab5e4/cert-manager-cainjector/0.log" Feb 25 08:17:40 crc kubenswrapper[4749]: I0225 08:17:40.699108 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-h5fwj_acd8dc37-5680-4838-aa0e-bf79c4209283/cert-manager-webhook/0.log" Feb 25 08:17:55 crc kubenswrapper[4749]: I0225 08:17:55.860356 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-gvgqz_3b819184-1695-4312-a0f4-0e0bad53a7d7/nmstate-console-plugin/0.log" Feb 25 08:17:56 crc kubenswrapper[4749]: I0225 08:17:56.093406 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-wnqc6_7ba6588d-93c3-481d-8606-1b91fee0267a/kube-rbac-proxy/0.log" Feb 25 08:17:56 crc kubenswrapper[4749]: I0225 08:17:56.116033 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xpxjw_eef1bf93-8a75-4ff2-b172-7601b6861aef/nmstate-handler/0.log" Feb 25 08:17:56 crc kubenswrapper[4749]: I0225 08:17:56.132809 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-wnqc6_7ba6588d-93c3-481d-8606-1b91fee0267a/nmstate-metrics/0.log" Feb 25 08:17:56 crc kubenswrapper[4749]: I0225 08:17:56.277178 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-tpddj_956da30d-e0b4-45a1-a3b4-773cdada4e30/nmstate-operator/0.log" Feb 25 08:17:56 crc kubenswrapper[4749]: I0225 08:17:56.323920 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-6flq6_ed80efda-7ad1-4992-9d76-f94d50e57216/nmstate-webhook/0.log" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.150948 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29533458-fps56"] Feb 25 08:18:00 crc kubenswrapper[4749]: E0225 08:18:00.151974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="extract-utilities" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.151994 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="extract-utilities" Feb 25 08:18:00 crc kubenswrapper[4749]: E0225 08:18:00.152021 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="extract-content" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.152029 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="extract-content" Feb 25 08:18:00 crc kubenswrapper[4749]: E0225 08:18:00.152056 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02af7096-2076-47c6-81c4-57cb8e3c0c31" containerName="oc" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.152064 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02af7096-2076-47c6-81c4-57cb8e3c0c31" containerName="oc" Feb 25 08:18:00 crc kubenswrapper[4749]: E0225 08:18:00.152077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="registry-server" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.152085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="registry-server" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.152281 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24c31bd-364e-4a4c-abdb-327239f018bf" containerName="registry-server" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.152299 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="02af7096-2076-47c6-81c4-57cb8e3c0c31" 
containerName="oc" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.153006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.155035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.155045 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.157250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.163143 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533458-fps56"] Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.262736 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22668\" (UniqueName: \"kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668\") pod \"auto-csr-approver-29533458-fps56\" (UID: \"0474ed4a-c9df-415c-abfe-a652ba550427\") " pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.364820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22668\" (UniqueName: \"kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668\") pod \"auto-csr-approver-29533458-fps56\" (UID: \"0474ed4a-c9df-415c-abfe-a652ba550427\") " pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.398078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22668\" (UniqueName: 
\"kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668\") pod \"auto-csr-approver-29533458-fps56\" (UID: \"0474ed4a-c9df-415c-abfe-a652ba550427\") " pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.473436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:00 crc kubenswrapper[4749]: I0225 08:18:00.992313 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533458-fps56"] Feb 25 08:18:01 crc kubenswrapper[4749]: I0225 08:18:01.954655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533458-fps56" event={"ID":"0474ed4a-c9df-415c-abfe-a652ba550427","Type":"ContainerStarted","Data":"64d24bc3bb177bf82fe6487b1ca962d28466e800553f77294259e1128596da98"} Feb 25 08:18:02 crc kubenswrapper[4749]: I0225 08:18:02.963043 4749 generic.go:334] "Generic (PLEG): container finished" podID="0474ed4a-c9df-415c-abfe-a652ba550427" containerID="8a5204df938b3f64d23f598e19e8b44d7de36c38e0bbb8de2e39a028aa99ed8a" exitCode=0 Feb 25 08:18:02 crc kubenswrapper[4749]: I0225 08:18:02.963104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533458-fps56" event={"ID":"0474ed4a-c9df-415c-abfe-a652ba550427","Type":"ContainerDied","Data":"8a5204df938b3f64d23f598e19e8b44d7de36c38e0bbb8de2e39a028aa99ed8a"} Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.336717 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.437405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22668\" (UniqueName: \"kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668\") pod \"0474ed4a-c9df-415c-abfe-a652ba550427\" (UID: \"0474ed4a-c9df-415c-abfe-a652ba550427\") " Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.443569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668" (OuterVolumeSpecName: "kube-api-access-22668") pod "0474ed4a-c9df-415c-abfe-a652ba550427" (UID: "0474ed4a-c9df-415c-abfe-a652ba550427"). InnerVolumeSpecName "kube-api-access-22668". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.539531 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22668\" (UniqueName: \"kubernetes.io/projected/0474ed4a-c9df-415c-abfe-a652ba550427-kube-api-access-22668\") on node \"crc\" DevicePath \"\"" Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.989470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533458-fps56" event={"ID":"0474ed4a-c9df-415c-abfe-a652ba550427","Type":"ContainerDied","Data":"64d24bc3bb177bf82fe6487b1ca962d28466e800553f77294259e1128596da98"} Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.990167 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d24bc3bb177bf82fe6487b1ca962d28466e800553f77294259e1128596da98" Feb 25 08:18:04 crc kubenswrapper[4749]: I0225 08:18:04.989648 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533458-fps56" Feb 25 08:18:05 crc kubenswrapper[4749]: I0225 08:18:05.436886 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533452-4nb79"] Feb 25 08:18:05 crc kubenswrapper[4749]: I0225 08:18:05.454093 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533452-4nb79"] Feb 25 08:18:07 crc kubenswrapper[4749]: I0225 08:18:07.347762 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267d7778-eb07-471d-a5f2-c9dc773b6e0f" path="/var/lib/kubelet/pods/267d7778-eb07-471d-a5f2-c9dc773b6e0f/volumes" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.595305 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jz8zq_9c8c2184-5984-4567-9de5-0141b3fc7fcd/kube-rbac-proxy/0.log" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.699991 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jz8zq_9c8c2184-5984-4567-9de5-0141b3fc7fcd/controller/0.log" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.780336 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.906572 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.916741 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:18:25 crc kubenswrapper[4749]: I0225 08:18:25.975325 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:18:25 
crc kubenswrapper[4749]: I0225 08:18:25.976874 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.140985 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.142972 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.166034 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.168639 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.357636 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/controller/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.361116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.364871 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.376941 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.530256 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/kube-rbac-proxy/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.600536 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/frr-metrics/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.654224 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/kube-rbac-proxy-frr/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.769155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/reloader/0.log" Feb 25 08:18:26 crc kubenswrapper[4749]: I0225 08:18:26.918153 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-lh45s_3bb90660-64f8-43f6-b7c0-a4449f75c9fb/frr-k8s-webhook-server/0.log" Feb 25 08:18:27 crc kubenswrapper[4749]: I0225 08:18:27.065562 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56887f7db6-72j45_3ae3ac00-e47a-4cc0-ba56-9f0f885163ca/manager/0.log" Feb 25 08:18:27 crc kubenswrapper[4749]: I0225 08:18:27.270969 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-96dfd7b56-f9wq6_aacb357e-4983-4cb4-86df-6fd3119e8b15/webhook-server/0.log" Feb 25 08:18:27 crc kubenswrapper[4749]: I0225 08:18:27.375149 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k7r4k_f9ece4c1-da53-4718-8a3a-aa9c0fd930da/kube-rbac-proxy/0.log" Feb 25 08:18:27 crc kubenswrapper[4749]: I0225 08:18:27.930242 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k7r4k_f9ece4c1-da53-4718-8a3a-aa9c0fd930da/speaker/0.log" Feb 25 08:18:28 crc kubenswrapper[4749]: I0225 08:18:28.222181 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/frr/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.334981 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.498980 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.508684 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.523519 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.674842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.681350 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/extract/0.log" Feb 25 08:18:42 crc kubenswrapper[4749]: I0225 08:18:42.689798 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:18:42 crc 
kubenswrapper[4749]: I0225 08:18:42.849040 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.012148 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.024459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.025647 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.183431 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.185236 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.417561 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.561650 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.571789 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.626262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.773943 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/registry-server/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.781115 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.832940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:18:43 crc kubenswrapper[4749]: I0225 08:18:43.961528 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.108049 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/registry-server/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.239431 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.240347 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.240465 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.399349 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.416179 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.421354 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/extract/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.583777 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pk7lm_e899a950-b7af-4fe2-b9db-856858e051fc/marketplace-operator/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.597978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.776946 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.792743 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.796701 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.958200 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:18:44 crc kubenswrapper[4749]: I0225 08:18:44.969445 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.125427 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/registry-server/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.187758 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.298069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.301429 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.321398 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.478906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.494015 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:18:45 crc kubenswrapper[4749]: I0225 08:18:45.990471 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/registry-server/0.log" Feb 25 08:18:54 crc kubenswrapper[4749]: I0225 08:18:54.188156 4749 scope.go:117] "RemoveContainer" containerID="8701b8d76f7d5a7012cbe3480c1c6a5a4bdafbdc4b0778dbfc54114f154f179c" Feb 25 08:19:21 crc kubenswrapper[4749]: I0225 08:19:21.671979 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:19:21 crc kubenswrapper[4749]: I0225 08:19:21.672569 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:19:51 crc kubenswrapper[4749]: I0225 08:19:51.671288 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:19:51 crc kubenswrapper[4749]: I0225 08:19:51.671972 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.163855 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533460-hb9dm"] Feb 25 08:20:00 crc kubenswrapper[4749]: E0225 08:20:00.164960 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0474ed4a-c9df-415c-abfe-a652ba550427" containerName="oc" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.164981 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0474ed4a-c9df-415c-abfe-a652ba550427" containerName="oc" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.165252 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0474ed4a-c9df-415c-abfe-a652ba550427" containerName="oc" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.166013 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.175885 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.177903 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.178008 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.180040 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533460-hb9dm"] Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.275338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmnq\" (UniqueName: \"kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq\") pod \"auto-csr-approver-29533460-hb9dm\" (UID: \"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2\") " pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.377115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmnq\" (UniqueName: \"kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq\") pod \"auto-csr-approver-29533460-hb9dm\" (UID: \"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2\") " pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.413777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcmnq\" (UniqueName: \"kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq\") pod \"auto-csr-approver-29533460-hb9dm\" (UID: \"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2\") " 
pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:00 crc kubenswrapper[4749]: I0225 08:20:00.497554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:01 crc kubenswrapper[4749]: I0225 08:20:01.012020 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533460-hb9dm"] Feb 25 08:20:01 crc kubenswrapper[4749]: I0225 08:20:01.022377 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 08:20:01 crc kubenswrapper[4749]: I0225 08:20:01.650663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" event={"ID":"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2","Type":"ContainerStarted","Data":"c2dcef2249a8b8e91f61fa457a7d65f2a26e9eca98cc0bf64474f367b2114159"} Feb 25 08:20:02 crc kubenswrapper[4749]: I0225 08:20:02.666180 4749 generic.go:334] "Generic (PLEG): container finished" podID="a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" containerID="e65a025c15eb5dfa88d81c43993789ea8fd98371726bee340b21f3f5936fca1a" exitCode=0 Feb 25 08:20:02 crc kubenswrapper[4749]: I0225 08:20:02.666439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" event={"ID":"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2","Type":"ContainerDied","Data":"e65a025c15eb5dfa88d81c43993789ea8fd98371726bee340b21f3f5936fca1a"} Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.910908 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.921847 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.944207 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.963557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.963657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:03 crc kubenswrapper[4749]: I0225 08:20:03.963685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vtm\" (UniqueName: \"kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.065197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.065259 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-62vtm\" (UniqueName: \"kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.065368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.065809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.065831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.085517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vtm\" (UniqueName: \"kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm\") pod \"community-operators-8djr9\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.152658 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.262489 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.268232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcmnq\" (UniqueName: \"kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq\") pod \"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2\" (UID: \"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2\") " Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.273800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq" (OuterVolumeSpecName: "kube-api-access-vcmnq") pod "a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" (UID: "a56de3a7-6733-46ef-a34d-1e1f6fdee6d2"). InnerVolumeSpecName "kube-api-access-vcmnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.370050 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcmnq\" (UniqueName: \"kubernetes.io/projected/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2-kube-api-access-vcmnq\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.688403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" event={"ID":"a56de3a7-6733-46ef-a34d-1e1f6fdee6d2","Type":"ContainerDied","Data":"c2dcef2249a8b8e91f61fa457a7d65f2a26e9eca98cc0bf64474f367b2114159"} Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.688756 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2dcef2249a8b8e91f61fa457a7d65f2a26e9eca98cc0bf64474f367b2114159" Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.688453 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533460-hb9dm" Feb 25 08:20:04 crc kubenswrapper[4749]: W0225 08:20:04.758137 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4065f8b3_1589_47be_ad74_a4eb97462531.slice/crio-99c8b34789030ab55628c2d9a847a71fafb84ada67a7c17efe7412170fdfcb50 WatchSource:0}: Error finding container 99c8b34789030ab55628c2d9a847a71fafb84ada67a7c17efe7412170fdfcb50: Status 404 returned error can't find the container with id 99c8b34789030ab55628c2d9a847a71fafb84ada67a7c17efe7412170fdfcb50 Feb 25 08:20:04 crc kubenswrapper[4749]: I0225 08:20:04.764053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:05 crc kubenswrapper[4749]: I0225 08:20:05.247858 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533454-rnssz"] Feb 25 08:20:05 crc 
kubenswrapper[4749]: I0225 08:20:05.258358 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533454-rnssz"] Feb 25 08:20:05 crc kubenswrapper[4749]: I0225 08:20:05.347485 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e37364d-70a4-4d07-8fb8-18168563fc1e" path="/var/lib/kubelet/pods/4e37364d-70a4-4d07-8fb8-18168563fc1e/volumes" Feb 25 08:20:05 crc kubenswrapper[4749]: I0225 08:20:05.709183 4749 generic.go:334] "Generic (PLEG): container finished" podID="4065f8b3-1589-47be-ad74-a4eb97462531" containerID="267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32" exitCode=0 Feb 25 08:20:05 crc kubenswrapper[4749]: I0225 08:20:05.709271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerDied","Data":"267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32"} Feb 25 08:20:05 crc kubenswrapper[4749]: I0225 08:20:05.709348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerStarted","Data":"99c8b34789030ab55628c2d9a847a71fafb84ada67a7c17efe7412170fdfcb50"} Feb 25 08:20:06 crc kubenswrapper[4749]: I0225 08:20:06.727031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerStarted","Data":"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd"} Feb 25 08:20:08 crc kubenswrapper[4749]: I0225 08:20:08.755708 4749 generic.go:334] "Generic (PLEG): container finished" podID="4065f8b3-1589-47be-ad74-a4eb97462531" containerID="c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd" exitCode=0 Feb 25 08:20:08 crc kubenswrapper[4749]: I0225 08:20:08.755983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerDied","Data":"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd"} Feb 25 08:20:09 crc kubenswrapper[4749]: I0225 08:20:09.767558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerStarted","Data":"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c"} Feb 25 08:20:09 crc kubenswrapper[4749]: I0225 08:20:09.804083 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8djr9" podStartSLOduration=3.331878332 podStartE2EDuration="6.804059198s" podCreationTimestamp="2026-02-25 08:20:03 +0000 UTC" firstStartedPulling="2026-02-25 08:20:05.71278066 +0000 UTC m=+3759.074606720" lastFinishedPulling="2026-02-25 08:20:09.184961526 +0000 UTC m=+3762.546787586" observedRunningTime="2026-02-25 08:20:09.801085806 +0000 UTC m=+3763.162911866" watchObservedRunningTime="2026-02-25 08:20:09.804059198 +0000 UTC m=+3763.165885258" Feb 25 08:20:14 crc kubenswrapper[4749]: I0225 08:20:14.264377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:14 crc kubenswrapper[4749]: I0225 08:20:14.266588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:14 crc kubenswrapper[4749]: I0225 08:20:14.390181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:14 crc kubenswrapper[4749]: I0225 08:20:14.921124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:14 crc kubenswrapper[4749]: I0225 08:20:14.985040 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:16 crc kubenswrapper[4749]: I0225 08:20:16.869649 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8djr9" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="registry-server" containerID="cri-o://91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c" gracePeriod=2 Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.456030 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.576472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62vtm\" (UniqueName: \"kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm\") pod \"4065f8b3-1589-47be-ad74-a4eb97462531\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.576745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content\") pod \"4065f8b3-1589-47be-ad74-a4eb97462531\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.576800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities\") pod \"4065f8b3-1589-47be-ad74-a4eb97462531\" (UID: \"4065f8b3-1589-47be-ad74-a4eb97462531\") " Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.579931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities" (OuterVolumeSpecName: "utilities") pod 
"4065f8b3-1589-47be-ad74-a4eb97462531" (UID: "4065f8b3-1589-47be-ad74-a4eb97462531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.586709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm" (OuterVolumeSpecName: "kube-api-access-62vtm") pod "4065f8b3-1589-47be-ad74-a4eb97462531" (UID: "4065f8b3-1589-47be-ad74-a4eb97462531"). InnerVolumeSpecName "kube-api-access-62vtm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.630559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4065f8b3-1589-47be-ad74-a4eb97462531" (UID: "4065f8b3-1589-47be-ad74-a4eb97462531"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.680311 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62vtm\" (UniqueName: \"kubernetes.io/projected/4065f8b3-1589-47be-ad74-a4eb97462531-kube-api-access-62vtm\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.680356 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.680374 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4065f8b3-1589-47be-ad74-a4eb97462531-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.886899 4749 generic.go:334] "Generic (PLEG): container finished" podID="4065f8b3-1589-47be-ad74-a4eb97462531" containerID="91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c" exitCode=0 Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.886962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerDied","Data":"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c"} Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.887012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8djr9" event={"ID":"4065f8b3-1589-47be-ad74-a4eb97462531","Type":"ContainerDied","Data":"99c8b34789030ab55628c2d9a847a71fafb84ada67a7c17efe7412170fdfcb50"} Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.887043 4749 scope.go:117] "RemoveContainer" containerID="91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 
08:20:17.889846 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8djr9" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.921229 4749 scope.go:117] "RemoveContainer" containerID="c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.967059 4749 scope.go:117] "RemoveContainer" containerID="267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32" Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.968972 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:17 crc kubenswrapper[4749]: I0225 08:20:17.978075 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8djr9"] Feb 25 08:20:18 crc kubenswrapper[4749]: I0225 08:20:18.027526 4749 scope.go:117] "RemoveContainer" containerID="91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c" Feb 25 08:20:18 crc kubenswrapper[4749]: E0225 08:20:18.028086 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c\": container with ID starting with 91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c not found: ID does not exist" containerID="91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c" Feb 25 08:20:18 crc kubenswrapper[4749]: I0225 08:20:18.028152 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c"} err="failed to get container status \"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c\": rpc error: code = NotFound desc = could not find container \"91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c\": container with ID starting with 
91ff6f63bc880c17574d0585dc6d538e990f42e650673d9a9785cc80de55758c not found: ID does not exist" Feb 25 08:20:18 crc kubenswrapper[4749]: I0225 08:20:18.028192 4749 scope.go:117] "RemoveContainer" containerID="c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd" Feb 25 08:20:18 crc kubenswrapper[4749]: E0225 08:20:18.029193 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd\": container with ID starting with c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd not found: ID does not exist" containerID="c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd" Feb 25 08:20:18 crc kubenswrapper[4749]: I0225 08:20:18.029235 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd"} err="failed to get container status \"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd\": rpc error: code = NotFound desc = could not find container \"c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd\": container with ID starting with c0d5d4e3eb086e01cde57a4f3d3b24fea95314c13b700d2f09a681f2bc6ed2fd not found: ID does not exist" Feb 25 08:20:18 crc kubenswrapper[4749]: I0225 08:20:18.029258 4749 scope.go:117] "RemoveContainer" containerID="267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32" Feb 25 08:20:18 crc kubenswrapper[4749]: E0225 08:20:18.029763 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32\": container with ID starting with 267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32 not found: ID does not exist" containerID="267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32" Feb 25 08:20:18 crc 
kubenswrapper[4749]: I0225 08:20:18.029805 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32"} err="failed to get container status \"267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32\": rpc error: code = NotFound desc = could not find container \"267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32\": container with ID starting with 267238ae8212ecf8fca4c8f154ffd39bd3cd5791321029ac2a616264d794bb32 not found: ID does not exist" Feb 25 08:20:19 crc kubenswrapper[4749]: I0225 08:20:19.345675 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" path="/var/lib/kubelet/pods/4065f8b3-1589-47be-ad74-a4eb97462531/volumes" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.671765 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.672228 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.672272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.672846 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.672923 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" gracePeriod=600 Feb 25 08:20:21 crc kubenswrapper[4749]: E0225 08:20:21.797707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.948204 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" exitCode=0 Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.948259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a"} Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.948297 4749 scope.go:117] "RemoveContainer" containerID="b21a3072b1c49200736f9f3d1429a961139402ad2ca61dd97ae50157ff679b18" Feb 25 08:20:21 crc kubenswrapper[4749]: I0225 08:20:21.949017 4749 
scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:20:21 crc kubenswrapper[4749]: E0225 08:20:21.949304 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:20:28 crc kubenswrapper[4749]: I0225 08:20:28.026768 4749 generic.go:334] "Generic (PLEG): container finished" podID="420840d3-bf89-4f03-85b0-924e79badf6a" containerID="c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153" exitCode=0 Feb 25 08:20:28 crc kubenswrapper[4749]: I0225 08:20:28.026934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" event={"ID":"420840d3-bf89-4f03-85b0-924e79badf6a","Type":"ContainerDied","Data":"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153"} Feb 25 08:20:28 crc kubenswrapper[4749]: I0225 08:20:28.028202 4749 scope.go:117] "RemoveContainer" containerID="c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153" Feb 25 08:20:28 crc kubenswrapper[4749]: I0225 08:20:28.649562 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5mdj_must-gather-tq4w6_420840d3-bf89-4f03-85b0-924e79badf6a/gather/0.log" Feb 25 08:20:36 crc kubenswrapper[4749]: I0225 08:20:36.323318 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:20:36 crc kubenswrapper[4749]: E0225 08:20:36.324745 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.405365 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z5mdj/must-gather-tq4w6"] Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.405973 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="copy" containerID="cri-o://07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296" gracePeriod=2 Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.416565 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z5mdj/must-gather-tq4w6"] Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.862982 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5mdj_must-gather-tq4w6_420840d3-bf89-4f03-85b0-924e79badf6a/copy/0.log" Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.863654 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.964548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output\") pod \"420840d3-bf89-4f03-85b0-924e79badf6a\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.964676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdwv\" (UniqueName: \"kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv\") pod \"420840d3-bf89-4f03-85b0-924e79badf6a\" (UID: \"420840d3-bf89-4f03-85b0-924e79badf6a\") " Feb 25 08:20:37 crc kubenswrapper[4749]: I0225 08:20:37.970550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv" (OuterVolumeSpecName: "kube-api-access-9hdwv") pod "420840d3-bf89-4f03-85b0-924e79badf6a" (UID: "420840d3-bf89-4f03-85b0-924e79badf6a"). InnerVolumeSpecName "kube-api-access-9hdwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.067059 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdwv\" (UniqueName: \"kubernetes.io/projected/420840d3-bf89-4f03-85b0-924e79badf6a-kube-api-access-9hdwv\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.125823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "420840d3-bf89-4f03-85b0-924e79badf6a" (UID: "420840d3-bf89-4f03-85b0-924e79badf6a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.161116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z5mdj_must-gather-tq4w6_420840d3-bf89-4f03-85b0-924e79badf6a/copy/0.log" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.162012 4749 generic.go:334] "Generic (PLEG): container finished" podID="420840d3-bf89-4f03-85b0-924e79badf6a" containerID="07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296" exitCode=143 Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.162094 4749 scope.go:117] "RemoveContainer" containerID="07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.162216 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z5mdj/must-gather-tq4w6" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.169888 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420840d3-bf89-4f03-85b0-924e79badf6a-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.196311 4749 scope.go:117] "RemoveContainer" containerID="c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.277488 4749 scope.go:117] "RemoveContainer" containerID="07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296" Feb 25 08:20:38 crc kubenswrapper[4749]: E0225 08:20:38.278174 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296\": container with ID starting with 07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296 not found: ID does not exist" 
containerID="07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.278252 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296"} err="failed to get container status \"07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296\": rpc error: code = NotFound desc = could not find container \"07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296\": container with ID starting with 07f4815e3c38721824ab0e2b229aaad8cc933858e7ced18c550b47551715a296 not found: ID does not exist" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.278300 4749 scope.go:117] "RemoveContainer" containerID="c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153" Feb 25 08:20:38 crc kubenswrapper[4749]: E0225 08:20:38.278880 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153\": container with ID starting with c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153 not found: ID does not exist" containerID="c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153" Feb 25 08:20:38 crc kubenswrapper[4749]: I0225 08:20:38.278942 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153"} err="failed to get container status \"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153\": rpc error: code = NotFound desc = could not find container \"c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153\": container with ID starting with c6afa3d946eeda5be2cc793b19b80f3b870d34ccced2b0dc6976d5b678740153 not found: ID does not exist" Feb 25 08:20:39 crc kubenswrapper[4749]: I0225 08:20:39.333748 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" path="/var/lib/kubelet/pods/420840d3-bf89-4f03-85b0-924e79badf6a/volumes" Feb 25 08:20:51 crc kubenswrapper[4749]: I0225 08:20:51.323250 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:20:51 crc kubenswrapper[4749]: E0225 08:20:51.324346 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:20:54 crc kubenswrapper[4749]: I0225 08:20:54.331788 4749 scope.go:117] "RemoveContainer" containerID="461097fad66298170b4945c3002aed9220c77fda47246e29760e1f2b711523c1" Feb 25 08:21:06 crc kubenswrapper[4749]: I0225 08:21:06.322471 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:21:06 crc kubenswrapper[4749]: E0225 08:21:06.323664 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:21:17 crc kubenswrapper[4749]: I0225 08:21:17.333736 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:21:17 crc kubenswrapper[4749]: E0225 08:21:17.334870 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:21:28 crc kubenswrapper[4749]: I0225 08:21:28.322833 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:21:28 crc kubenswrapper[4749]: E0225 08:21:28.323962 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:21:43 crc kubenswrapper[4749]: I0225 08:21:43.323143 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:21:43 crc kubenswrapper[4749]: E0225 08:21:43.324180 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:21:54 crc kubenswrapper[4749]: I0225 08:21:54.435819 4749 scope.go:117] "RemoveContainer" containerID="893190b29f7a24bb6f87ac9780ae5edc46480d1078aad18b13ed51e665cd240c" Feb 25 08:21:55 crc kubenswrapper[4749]: I0225 08:21:55.322636 4749 scope.go:117] 
"RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:21:55 crc kubenswrapper[4749]: E0225 08:21:55.323483 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.153681 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533462-cjwpr"] Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 08:22:00.154648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="gather" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154665 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="gather" Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 08:22:00.154681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="registry-server" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="registry-server" Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 08:22:00.154727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="extract-utilities" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154736 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="extract-utilities" Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 
08:22:00.154767 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" containerName="oc" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154776 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" containerName="oc" Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 08:22:00.154787 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="copy" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154795 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="copy" Feb 25 08:22:00 crc kubenswrapper[4749]: E0225 08:22:00.154810 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="extract-content" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.154818 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="extract-content" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.155032 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4065f8b3-1589-47be-ad74-a4eb97462531" containerName="registry-server" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.155061 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="gather" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.155120 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" containerName="oc" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.155146 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="420840d3-bf89-4f03-85b0-924e79badf6a" containerName="copy" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.155946 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.158022 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.159250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.165608 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.176843 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533462-cjwpr"] Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.256499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rhzl\" (UniqueName: \"kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl\") pod \"auto-csr-approver-29533462-cjwpr\" (UID: \"39c24884-942b-413b-a8d2-e1fed54ab1d3\") " pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.359336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rhzl\" (UniqueName: \"kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl\") pod \"auto-csr-approver-29533462-cjwpr\" (UID: \"39c24884-942b-413b-a8d2-e1fed54ab1d3\") " pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.391236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rhzl\" (UniqueName: \"kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl\") pod \"auto-csr-approver-29533462-cjwpr\" (UID: \"39c24884-942b-413b-a8d2-e1fed54ab1d3\") " 
pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.478072 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:00 crc kubenswrapper[4749]: I0225 08:22:00.991182 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533462-cjwpr"] Feb 25 08:22:01 crc kubenswrapper[4749]: I0225 08:22:01.082197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" event={"ID":"39c24884-942b-413b-a8d2-e1fed54ab1d3","Type":"ContainerStarted","Data":"aede29e997fcf443bbf07e9324a8789086abde131616f8c7b2662eabf8ab4e2c"} Feb 25 08:22:03 crc kubenswrapper[4749]: I0225 08:22:03.107831 4749 generic.go:334] "Generic (PLEG): container finished" podID="39c24884-942b-413b-a8d2-e1fed54ab1d3" containerID="c267f93506dffde9c89235856bf8a0616279974bd8a5929fdc29f8afae19ac28" exitCode=0 Feb 25 08:22:03 crc kubenswrapper[4749]: I0225 08:22:03.107904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" event={"ID":"39c24884-942b-413b-a8d2-e1fed54ab1d3","Type":"ContainerDied","Data":"c267f93506dffde9c89235856bf8a0616279974bd8a5929fdc29f8afae19ac28"} Feb 25 08:22:04 crc kubenswrapper[4749]: I0225 08:22:04.494682 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:04 crc kubenswrapper[4749]: I0225 08:22:04.645515 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rhzl\" (UniqueName: \"kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl\") pod \"39c24884-942b-413b-a8d2-e1fed54ab1d3\" (UID: \"39c24884-942b-413b-a8d2-e1fed54ab1d3\") " Feb 25 08:22:04 crc kubenswrapper[4749]: I0225 08:22:04.655408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl" (OuterVolumeSpecName: "kube-api-access-2rhzl") pod "39c24884-942b-413b-a8d2-e1fed54ab1d3" (UID: "39c24884-942b-413b-a8d2-e1fed54ab1d3"). InnerVolumeSpecName "kube-api-access-2rhzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:22:04 crc kubenswrapper[4749]: I0225 08:22:04.749102 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rhzl\" (UniqueName: \"kubernetes.io/projected/39c24884-942b-413b-a8d2-e1fed54ab1d3-kube-api-access-2rhzl\") on node \"crc\" DevicePath \"\"" Feb 25 08:22:05 crc kubenswrapper[4749]: I0225 08:22:05.134913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" event={"ID":"39c24884-942b-413b-a8d2-e1fed54ab1d3","Type":"ContainerDied","Data":"aede29e997fcf443bbf07e9324a8789086abde131616f8c7b2662eabf8ab4e2c"} Feb 25 08:22:05 crc kubenswrapper[4749]: I0225 08:22:05.135296 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aede29e997fcf443bbf07e9324a8789086abde131616f8c7b2662eabf8ab4e2c" Feb 25 08:22:05 crc kubenswrapper[4749]: I0225 08:22:05.134968 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533462-cjwpr" Feb 25 08:22:05 crc kubenswrapper[4749]: I0225 08:22:05.594585 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533456-qlk2b"] Feb 25 08:22:05 crc kubenswrapper[4749]: I0225 08:22:05.605867 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533456-qlk2b"] Feb 25 08:22:06 crc kubenswrapper[4749]: I0225 08:22:06.323303 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:22:06 crc kubenswrapper[4749]: E0225 08:22:06.323645 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:22:07 crc kubenswrapper[4749]: I0225 08:22:07.335480 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02af7096-2076-47c6-81c4-57cb8e3c0c31" path="/var/lib/kubelet/pods/02af7096-2076-47c6-81c4-57cb8e3c0c31/volumes" Feb 25 08:22:20 crc kubenswrapper[4749]: I0225 08:22:20.322839 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:22:20 crc kubenswrapper[4749]: E0225 08:22:20.323980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:22:35 crc kubenswrapper[4749]: I0225 08:22:35.322492 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:22:35 crc kubenswrapper[4749]: E0225 08:22:35.323506 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:22:50 crc kubenswrapper[4749]: I0225 08:22:50.322778 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:22:50 crc kubenswrapper[4749]: E0225 08:22:50.326070 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:22:54 crc kubenswrapper[4749]: I0225 08:22:54.520898 4749 scope.go:117] "RemoveContainer" containerID="1e3a9a928ef7db64390c5e24814632685a0662fdfaf9b83d7bc68d6ad697d0d8" Feb 25 08:23:04 crc kubenswrapper[4749]: I0225 08:23:04.322720 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:23:04 crc kubenswrapper[4749]: E0225 08:23:04.324083 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:23:18 crc kubenswrapper[4749]: I0225 08:23:18.323279 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:23:18 crc kubenswrapper[4749]: E0225 08:23:18.324366 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:23:32 crc kubenswrapper[4749]: I0225 08:23:32.322979 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:23:32 crc kubenswrapper[4749]: E0225 08:23:32.325545 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.520976 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4j5k9/must-gather-xcbht"] Feb 25 08:23:37 crc kubenswrapper[4749]: E0225 08:23:37.521778 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c24884-942b-413b-a8d2-e1fed54ab1d3" containerName="oc" Feb 25 08:23:37 crc kubenswrapper[4749]: 
I0225 08:23:37.521790 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c24884-942b-413b-a8d2-e1fed54ab1d3" containerName="oc" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.521974 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c24884-942b-413b-a8d2-e1fed54ab1d3" containerName="oc" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.522867 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.528574 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4j5k9"/"openshift-service-ca.crt" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.528743 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4j5k9"/"kube-root-ca.crt" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.553118 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4j5k9/must-gather-xcbht"] Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.558842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtlq\" (UniqueName: \"kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.558936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.661075 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vjtlq\" (UniqueName: \"kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.661436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.662188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.704573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjtlq\" (UniqueName: \"kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq\") pod \"must-gather-xcbht\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:37 crc kubenswrapper[4749]: I0225 08:23:37.853749 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:23:38 crc kubenswrapper[4749]: I0225 08:23:38.288266 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4j5k9/must-gather-xcbht"] Feb 25 08:23:38 crc kubenswrapper[4749]: W0225 08:23:38.294843 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c08d2a9_92d6_4587_b813_d098e010a208.slice/crio-d9729dfc8ff63bd201fb428b99968b26a05bdd9fa5387e443adbc48e47a55458 WatchSource:0}: Error finding container d9729dfc8ff63bd201fb428b99968b26a05bdd9fa5387e443adbc48e47a55458: Status 404 returned error can't find the container with id d9729dfc8ff63bd201fb428b99968b26a05bdd9fa5387e443adbc48e47a55458 Feb 25 08:23:38 crc kubenswrapper[4749]: I0225 08:23:38.428030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/must-gather-xcbht" event={"ID":"3c08d2a9-92d6-4587-b813-d098e010a208","Type":"ContainerStarted","Data":"d9729dfc8ff63bd201fb428b99968b26a05bdd9fa5387e443adbc48e47a55458"} Feb 25 08:23:39 crc kubenswrapper[4749]: I0225 08:23:39.444736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/must-gather-xcbht" event={"ID":"3c08d2a9-92d6-4587-b813-d098e010a208","Type":"ContainerStarted","Data":"70559ef78ba644fc16a321373a6bba4acc2a10ae4261bdde82ac5896e8519f58"} Feb 25 08:23:39 crc kubenswrapper[4749]: I0225 08:23:39.445186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/must-gather-xcbht" event={"ID":"3c08d2a9-92d6-4587-b813-d098e010a208","Type":"ContainerStarted","Data":"34b8e98960c51106f67a9a7b0fcf614e4fdacad206435a10043510c1f503fa8a"} Feb 25 08:23:39 crc kubenswrapper[4749]: I0225 08:23:39.473774 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4j5k9/must-gather-xcbht" podStartSLOduration=2.47374246 
podStartE2EDuration="2.47374246s" podCreationTimestamp="2026-02-25 08:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 08:23:39.470338746 +0000 UTC m=+3972.832164806" watchObservedRunningTime="2026-02-25 08:23:39.47374246 +0000 UTC m=+3972.835568520" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.185490 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-qvbbx"] Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.187358 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.189635 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4j5k9"/"default-dockercfg-9gcg5" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.247189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntq9\" (UniqueName: \"kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9\") pod \"crc-debug-qvbbx\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.247268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host\") pod \"crc-debug-qvbbx\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.349121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntq9\" (UniqueName: \"kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9\") pod \"crc-debug-qvbbx\" (UID: 
\"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.349242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host\") pod \"crc-debug-qvbbx\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.349399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host\") pod \"crc-debug-qvbbx\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.377656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntq9\" (UniqueName: \"kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9\") pod \"crc-debug-qvbbx\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:42 crc kubenswrapper[4749]: I0225 08:23:42.510890 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:23:43 crc kubenswrapper[4749]: I0225 08:23:43.501392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" event={"ID":"20105aa2-a98a-4005-b43e-f3d843447b55","Type":"ContainerStarted","Data":"42959be7dd88a9e6f6a244938d551be5fcf56c6a0aefe51a9fe4ecf93859f1f1"} Feb 25 08:23:43 crc kubenswrapper[4749]: I0225 08:23:43.502084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" event={"ID":"20105aa2-a98a-4005-b43e-f3d843447b55","Type":"ContainerStarted","Data":"13c6d8c333c32a307803446938b622e1e2831fe217ab6b345605e40a681eb906"} Feb 25 08:23:43 crc kubenswrapper[4749]: I0225 08:23:43.520168 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" podStartSLOduration=1.520151987 podStartE2EDuration="1.520151987s" podCreationTimestamp="2026-02-25 08:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 08:23:43.519629474 +0000 UTC m=+3976.881455494" watchObservedRunningTime="2026-02-25 08:23:43.520151987 +0000 UTC m=+3976.881978007" Feb 25 08:23:45 crc kubenswrapper[4749]: I0225 08:23:45.322732 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:23:45 crc kubenswrapper[4749]: E0225 08:23:45.323385 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:23:57 crc 
kubenswrapper[4749]: I0225 08:23:57.330655 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:23:57 crc kubenswrapper[4749]: E0225 08:23:57.331456 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.152616 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533464-9gqjf"] Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.155349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.158096 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.158225 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.158293 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.166655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533464-9gqjf"] Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.289204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hqsb\" (UniqueName: 
\"kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb\") pod \"auto-csr-approver-29533464-9gqjf\" (UID: \"251b11d2-54f9-4aad-8ab8-e9f984c331ef\") " pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.391383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hqsb\" (UniqueName: \"kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb\") pod \"auto-csr-approver-29533464-9gqjf\" (UID: \"251b11d2-54f9-4aad-8ab8-e9f984c331ef\") " pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.411257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hqsb\" (UniqueName: \"kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb\") pod \"auto-csr-approver-29533464-9gqjf\" (UID: \"251b11d2-54f9-4aad-8ab8-e9f984c331ef\") " pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:00 crc kubenswrapper[4749]: I0225 08:24:00.476904 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:01 crc kubenswrapper[4749]: I0225 08:24:01.049157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533464-9gqjf"] Feb 25 08:24:01 crc kubenswrapper[4749]: I0225 08:24:01.685307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" event={"ID":"251b11d2-54f9-4aad-8ab8-e9f984c331ef","Type":"ContainerStarted","Data":"0945331fb1649baf05dd5df6ca92eaebe62a6e0364d5238095c41f954797c6f7"} Feb 25 08:24:02 crc kubenswrapper[4749]: I0225 08:24:02.694169 4749 generic.go:334] "Generic (PLEG): container finished" podID="251b11d2-54f9-4aad-8ab8-e9f984c331ef" containerID="a7ce5101122fc5ed4d68321b33b4e8f99201a5157b61ef122dd8b93332583db9" exitCode=0 Feb 25 08:24:02 crc kubenswrapper[4749]: I0225 08:24:02.694246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" event={"ID":"251b11d2-54f9-4aad-8ab8-e9f984c331ef","Type":"ContainerDied","Data":"a7ce5101122fc5ed4d68321b33b4e8f99201a5157b61ef122dd8b93332583db9"} Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.087232 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.268872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hqsb\" (UniqueName: \"kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb\") pod \"251b11d2-54f9-4aad-8ab8-e9f984c331ef\" (UID: \"251b11d2-54f9-4aad-8ab8-e9f984c331ef\") " Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.273978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb" (OuterVolumeSpecName: "kube-api-access-9hqsb") pod "251b11d2-54f9-4aad-8ab8-e9f984c331ef" (UID: "251b11d2-54f9-4aad-8ab8-e9f984c331ef"). InnerVolumeSpecName "kube-api-access-9hqsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.371643 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hqsb\" (UniqueName: \"kubernetes.io/projected/251b11d2-54f9-4aad-8ab8-e9f984c331ef-kube-api-access-9hqsb\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.715620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" event={"ID":"251b11d2-54f9-4aad-8ab8-e9f984c331ef","Type":"ContainerDied","Data":"0945331fb1649baf05dd5df6ca92eaebe62a6e0364d5238095c41f954797c6f7"} Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.716016 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0945331fb1649baf05dd5df6ca92eaebe62a6e0364d5238095c41f954797c6f7" Feb 25 08:24:04 crc kubenswrapper[4749]: I0225 08:24:04.715687 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533464-9gqjf" Feb 25 08:24:05 crc kubenswrapper[4749]: I0225 08:24:05.169153 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533458-fps56"] Feb 25 08:24:05 crc kubenswrapper[4749]: I0225 08:24:05.181378 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533458-fps56"] Feb 25 08:24:05 crc kubenswrapper[4749]: I0225 08:24:05.338150 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0474ed4a-c9df-415c-abfe-a652ba550427" path="/var/lib/kubelet/pods/0474ed4a-c9df-415c-abfe-a652ba550427/volumes" Feb 25 08:24:10 crc kubenswrapper[4749]: I0225 08:24:10.323004 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:24:10 crc kubenswrapper[4749]: E0225 08:24:10.323754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:24:14 crc kubenswrapper[4749]: I0225 08:24:14.811204 4749 generic.go:334] "Generic (PLEG): container finished" podID="20105aa2-a98a-4005-b43e-f3d843447b55" containerID="42959be7dd88a9e6f6a244938d551be5fcf56c6a0aefe51a9fe4ecf93859f1f1" exitCode=0 Feb 25 08:24:14 crc kubenswrapper[4749]: I0225 08:24:14.811282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" event={"ID":"20105aa2-a98a-4005-b43e-f3d843447b55","Type":"ContainerDied","Data":"42959be7dd88a9e6f6a244938d551be5fcf56c6a0aefe51a9fe4ecf93859f1f1"} Feb 25 08:24:15 crc kubenswrapper[4749]: I0225 08:24:15.937051 4749 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:24:15 crc kubenswrapper[4749]: I0225 08:24:15.977549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-qvbbx"] Feb 25 08:24:15 crc kubenswrapper[4749]: I0225 08:24:15.984978 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-qvbbx"] Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.086195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host\") pod \"20105aa2-a98a-4005-b43e-f3d843447b55\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.086395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host" (OuterVolumeSpecName: "host") pod "20105aa2-a98a-4005-b43e-f3d843447b55" (UID: "20105aa2-a98a-4005-b43e-f3d843447b55"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.086420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jntq9\" (UniqueName: \"kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9\") pod \"20105aa2-a98a-4005-b43e-f3d843447b55\" (UID: \"20105aa2-a98a-4005-b43e-f3d843447b55\") " Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.087708 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20105aa2-a98a-4005-b43e-f3d843447b55-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.092468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9" (OuterVolumeSpecName: "kube-api-access-jntq9") pod "20105aa2-a98a-4005-b43e-f3d843447b55" (UID: "20105aa2-a98a-4005-b43e-f3d843447b55"). InnerVolumeSpecName "kube-api-access-jntq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.189537 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jntq9\" (UniqueName: \"kubernetes.io/projected/20105aa2-a98a-4005-b43e-f3d843447b55-kube-api-access-jntq9\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.829692 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c6d8c333c32a307803446938b622e1e2831fe217ab6b345605e40a681eb906" Feb 25 08:24:16 crc kubenswrapper[4749]: I0225 08:24:16.829738 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-qvbbx" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.247265 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-rlt6z"] Feb 25 08:24:17 crc kubenswrapper[4749]: E0225 08:24:17.247734 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20105aa2-a98a-4005-b43e-f3d843447b55" containerName="container-00" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.247747 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="20105aa2-a98a-4005-b43e-f3d843447b55" containerName="container-00" Feb 25 08:24:17 crc kubenswrapper[4749]: E0225 08:24:17.247765 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251b11d2-54f9-4aad-8ab8-e9f984c331ef" containerName="oc" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.247772 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="251b11d2-54f9-4aad-8ab8-e9f984c331ef" containerName="oc" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.247995 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="20105aa2-a98a-4005-b43e-f3d843447b55" containerName="container-00" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.248011 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="251b11d2-54f9-4aad-8ab8-e9f984c331ef" containerName="oc" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.248624 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.251862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4j5k9"/"default-dockercfg-9gcg5" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.319662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host\") pod \"crc-debug-rlt6z\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.319786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzh5\" (UniqueName: \"kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5\") pod \"crc-debug-rlt6z\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.335035 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20105aa2-a98a-4005-b43e-f3d843447b55" path="/var/lib/kubelet/pods/20105aa2-a98a-4005-b43e-f3d843447b55/volumes" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.421961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host\") pod \"crc-debug-rlt6z\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.422083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzh5\" (UniqueName: \"kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5\") pod \"crc-debug-rlt6z\" (UID: 
\"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.422177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host\") pod \"crc-debug-rlt6z\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.446156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzh5\" (UniqueName: \"kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5\") pod \"crc-debug-rlt6z\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.569272 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:17 crc kubenswrapper[4749]: I0225 08:24:17.843637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" event={"ID":"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd","Type":"ContainerStarted","Data":"622d125ae33f51bae729c28051483bc3342158f9efcd7dfa79edb0b4317e48d9"} Feb 25 08:24:18 crc kubenswrapper[4749]: I0225 08:24:18.854867 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" containerID="2689ca73641f6832cff8f6f34d1e7483bd14525de1166be062e6607203aaa815" exitCode=0 Feb 25 08:24:18 crc kubenswrapper[4749]: I0225 08:24:18.854924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" event={"ID":"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd","Type":"ContainerDied","Data":"2689ca73641f6832cff8f6f34d1e7483bd14525de1166be062e6607203aaa815"} Feb 25 08:24:19 crc kubenswrapper[4749]: I0225 08:24:19.218806 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-rlt6z"] Feb 25 08:24:19 crc kubenswrapper[4749]: I0225 08:24:19.226966 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-rlt6z"] Feb 25 08:24:19 crc kubenswrapper[4749]: I0225 08:24:19.962781 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.069303 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host\") pod \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.069378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mzh5\" (UniqueName: \"kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5\") pod \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\" (UID: \"ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd\") " Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.069411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host" (OuterVolumeSpecName: "host") pod "ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" (UID: "ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.070062 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.074637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5" (OuterVolumeSpecName: "kube-api-access-9mzh5") pod "ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" (UID: "ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd"). InnerVolumeSpecName "kube-api-access-9mzh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.170922 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mzh5\" (UniqueName: \"kubernetes.io/projected/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd-kube-api-access-9mzh5\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.435333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-5kmm7"] Feb 25 08:24:20 crc kubenswrapper[4749]: E0225 08:24:20.435708 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" containerName="container-00" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.435725 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" containerName="container-00" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.435892 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" containerName="container-00" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.436486 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.476497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcwn\" (UniqueName: \"kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.476578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.577626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcwn\" (UniqueName: \"kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.577750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.577917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc 
kubenswrapper[4749]: I0225 08:24:20.594071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcwn\" (UniqueName: \"kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn\") pod \"crc-debug-5kmm7\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.753516 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:20 crc kubenswrapper[4749]: W0225 08:24:20.778821 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1a75a8_0b82_4774_9da8_27a60bc73c88.slice/crio-2173acd2810a8882178a8c70e6b6351e8be48d11511ecfd132a06a9f4b5368a7 WatchSource:0}: Error finding container 2173acd2810a8882178a8c70e6b6351e8be48d11511ecfd132a06a9f4b5368a7: Status 404 returned error can't find the container with id 2173acd2810a8882178a8c70e6b6351e8be48d11511ecfd132a06a9f4b5368a7 Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.872665 4749 scope.go:117] "RemoveContainer" containerID="2689ca73641f6832cff8f6f34d1e7483bd14525de1166be062e6607203aaa815" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.872690 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-rlt6z" Feb 25 08:24:20 crc kubenswrapper[4749]: I0225 08:24:20.874223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" event={"ID":"ee1a75a8-0b82-4774-9da8-27a60bc73c88","Type":"ContainerStarted","Data":"2173acd2810a8882178a8c70e6b6351e8be48d11511ecfd132a06a9f4b5368a7"} Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.322776 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:24:21 crc kubenswrapper[4749]: E0225 08:24:21.323123 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.334486 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd" path="/var/lib/kubelet/pods/ff38b7d9-a8b2-4eae-8ac6-7eeaed4555dd/volumes" Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.883279 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee1a75a8-0b82-4774-9da8-27a60bc73c88" containerID="99979f0718dce16d5ffbec910fc016500a76628b2f7e380373a838ee16a850e8" exitCode=0 Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.883346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" event={"ID":"ee1a75a8-0b82-4774-9da8-27a60bc73c88","Type":"ContainerDied","Data":"99979f0718dce16d5ffbec910fc016500a76628b2f7e380373a838ee16a850e8"} Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.924557 4749 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-4j5k9/crc-debug-5kmm7"] Feb 25 08:24:21 crc kubenswrapper[4749]: I0225 08:24:21.934961 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4j5k9/crc-debug-5kmm7"] Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.033786 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.228028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrcwn\" (UniqueName: \"kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn\") pod \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.228093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host\") pod \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\" (UID: \"ee1a75a8-0b82-4774-9da8-27a60bc73c88\") " Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.228401 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host" (OuterVolumeSpecName: "host") pod "ee1a75a8-0b82-4774-9da8-27a60bc73c88" (UID: "ee1a75a8-0b82-4774-9da8-27a60bc73c88"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.228570 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee1a75a8-0b82-4774-9da8-27a60bc73c88-host\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.245257 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn" (OuterVolumeSpecName: "kube-api-access-zrcwn") pod "ee1a75a8-0b82-4774-9da8-27a60bc73c88" (UID: "ee1a75a8-0b82-4774-9da8-27a60bc73c88"). InnerVolumeSpecName "kube-api-access-zrcwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.330676 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrcwn\" (UniqueName: \"kubernetes.io/projected/ee1a75a8-0b82-4774-9da8-27a60bc73c88-kube-api-access-zrcwn\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.331776 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1a75a8-0b82-4774-9da8-27a60bc73c88" path="/var/lib/kubelet/pods/ee1a75a8-0b82-4774-9da8-27a60bc73c88/volumes" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.912173 4749 scope.go:117] "RemoveContainer" containerID="99979f0718dce16d5ffbec910fc016500a76628b2f7e380373a838ee16a850e8" Feb 25 08:24:23 crc kubenswrapper[4749]: I0225 08:24:23.912205 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/crc-debug-5kmm7" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.437574 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:33 crc kubenswrapper[4749]: E0225 08:24:33.439330 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1a75a8-0b82-4774-9da8-27a60bc73c88" containerName="container-00" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.439408 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1a75a8-0b82-4774-9da8-27a60bc73c88" containerName="container-00" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.439710 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1a75a8-0b82-4774-9da8-27a60bc73c88" containerName="container-00" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.441099 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.470469 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.534034 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.534239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsrx4\" (UniqueName: \"kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " 
pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.534310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.635871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.636051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsrx4\" (UniqueName: \"kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.636118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.636707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " 
pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.637140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.653222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsrx4\" (UniqueName: \"kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4\") pod \"certified-operators-x8vd6\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:33 crc kubenswrapper[4749]: I0225 08:24:33.767994 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:34 crc kubenswrapper[4749]: I0225 08:24:34.297378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:35 crc kubenswrapper[4749]: I0225 08:24:35.044298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerStarted","Data":"8f3e822728b792bf1a162807c901d5fe039cb84888fccf469ac64a23eb8d279b"} Feb 25 08:24:36 crc kubenswrapper[4749]: I0225 08:24:36.056412 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerID="d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d" exitCode=0 Feb 25 08:24:36 crc kubenswrapper[4749]: I0225 08:24:36.056521 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" 
event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerDied","Data":"d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d"} Feb 25 08:24:36 crc kubenswrapper[4749]: I0225 08:24:36.322895 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:24:36 crc kubenswrapper[4749]: E0225 08:24:36.323270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:24:37 crc kubenswrapper[4749]: I0225 08:24:37.066506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerStarted","Data":"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b"} Feb 25 08:24:38 crc kubenswrapper[4749]: I0225 08:24:38.080147 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerID="00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b" exitCode=0 Feb 25 08:24:38 crc kubenswrapper[4749]: I0225 08:24:38.080355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerDied","Data":"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b"} Feb 25 08:24:39 crc kubenswrapper[4749]: I0225 08:24:39.102458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" 
event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerStarted","Data":"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885"} Feb 25 08:24:39 crc kubenswrapper[4749]: I0225 08:24:39.128075 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x8vd6" podStartSLOduration=3.729035547 podStartE2EDuration="6.128055423s" podCreationTimestamp="2026-02-25 08:24:33 +0000 UTC" firstStartedPulling="2026-02-25 08:24:36.0583051 +0000 UTC m=+4029.420131160" lastFinishedPulling="2026-02-25 08:24:38.457325016 +0000 UTC m=+4031.819151036" observedRunningTime="2026-02-25 08:24:39.125003999 +0000 UTC m=+4032.486830029" watchObservedRunningTime="2026-02-25 08:24:39.128055423 +0000 UTC m=+4032.489881453" Feb 25 08:24:43 crc kubenswrapper[4749]: I0225 08:24:43.768308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:43 crc kubenswrapper[4749]: I0225 08:24:43.768853 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:43 crc kubenswrapper[4749]: I0225 08:24:43.819508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:44 crc kubenswrapper[4749]: I0225 08:24:44.206944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:44 crc kubenswrapper[4749]: I0225 08:24:44.293000 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.156103 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x8vd6" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="registry-server" 
containerID="cri-o://9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885" gracePeriod=2 Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.613335 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.723640 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities\") pod \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.724214 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content\") pod \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.724452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsrx4\" (UniqueName: \"kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4\") pod \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\" (UID: \"3d008f16-3782-4d3e-b377-baabbb2ae4c4\") " Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.725418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities" (OuterVolumeSpecName: "utilities") pod "3d008f16-3782-4d3e-b377-baabbb2ae4c4" (UID: "3d008f16-3782-4d3e-b377-baabbb2ae4c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.734224 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4" (OuterVolumeSpecName: "kube-api-access-vsrx4") pod "3d008f16-3782-4d3e-b377-baabbb2ae4c4" (UID: "3d008f16-3782-4d3e-b377-baabbb2ae4c4"). InnerVolumeSpecName "kube-api-access-vsrx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.788777 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d008f16-3782-4d3e-b377-baabbb2ae4c4" (UID: "3d008f16-3782-4d3e-b377-baabbb2ae4c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.826810 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsrx4\" (UniqueName: \"kubernetes.io/projected/3d008f16-3782-4d3e-b377-baabbb2ae4c4-kube-api-access-vsrx4\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.826841 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:46 crc kubenswrapper[4749]: I0225 08:24:46.826851 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d008f16-3782-4d3e-b377-baabbb2ae4c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.165821 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" 
containerID="9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885" exitCode=0 Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.165890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerDied","Data":"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885"} Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.165930 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x8vd6" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.166920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x8vd6" event={"ID":"3d008f16-3782-4d3e-b377-baabbb2ae4c4","Type":"ContainerDied","Data":"8f3e822728b792bf1a162807c901d5fe039cb84888fccf469ac64a23eb8d279b"} Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.166937 4749 scope.go:117] "RemoveContainer" containerID="9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.184011 4749 scope.go:117] "RemoveContainer" containerID="00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.208739 4749 scope.go:117] "RemoveContainer" containerID="d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.216425 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.227751 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x8vd6"] Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.242993 4749 scope.go:117] "RemoveContainer" containerID="9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885" Feb 25 
08:24:47 crc kubenswrapper[4749]: E0225 08:24:47.243507 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885\": container with ID starting with 9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885 not found: ID does not exist" containerID="9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.243565 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885"} err="failed to get container status \"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885\": rpc error: code = NotFound desc = could not find container \"9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885\": container with ID starting with 9ba5c22ec43917c6a498d43dd95f1ae480255e09dee0f438e58fc5cd018fa885 not found: ID does not exist" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.243647 4749 scope.go:117] "RemoveContainer" containerID="00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b" Feb 25 08:24:47 crc kubenswrapper[4749]: E0225 08:24:47.244082 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b\": container with ID starting with 00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b not found: ID does not exist" containerID="00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.244115 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b"} err="failed to get container status 
\"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b\": rpc error: code = NotFound desc = could not find container \"00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b\": container with ID starting with 00f49e1eb62f364a73d920a85a4b525b59c377d77c6e274054f1285877b1bd1b not found: ID does not exist" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.244138 4749 scope.go:117] "RemoveContainer" containerID="d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d" Feb 25 08:24:47 crc kubenswrapper[4749]: E0225 08:24:47.244363 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d\": container with ID starting with d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d not found: ID does not exist" containerID="d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.244383 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d"} err="failed to get container status \"d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d\": rpc error: code = NotFound desc = could not find container \"d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d\": container with ID starting with d5b983cfd364bbbaf59f294ae388fa314adb5474b4966289f06af9c5f305501d not found: ID does not exist" Feb 25 08:24:47 crc kubenswrapper[4749]: I0225 08:24:47.337469 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" path="/var/lib/kubelet/pods/3d008f16-3782-4d3e-b377-baabbb2ae4c4/volumes" Feb 25 08:24:48 crc kubenswrapper[4749]: I0225 08:24:48.322662 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 
08:24:48 crc kubenswrapper[4749]: E0225 08:24:48.324390 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.678526 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:24:49 crc kubenswrapper[4749]: E0225 08:24:49.679095 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="extract-content" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.679111 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="extract-content" Feb 25 08:24:49 crc kubenswrapper[4749]: E0225 08:24:49.679137 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="registry-server" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.679144 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="registry-server" Feb 25 08:24:49 crc kubenswrapper[4749]: E0225 08:24:49.679159 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="extract-utilities" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.679167 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="extract-utilities" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.679416 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3d008f16-3782-4d3e-b377-baabbb2ae4c4" containerName="registry-server" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.681146 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.690322 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.781205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.781572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.781733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqqms\" (UniqueName: \"kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.883071 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " 
pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.883156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqqms\" (UniqueName: \"kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.883220 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.883525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.885308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:49 crc kubenswrapper[4749]: I0225 08:24:49.904621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqqms\" (UniqueName: \"kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms\") pod \"redhat-operators-776zd\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " pod="openshift-marketplace/redhat-operators-776zd" Feb 
25 08:24:50 crc kubenswrapper[4749]: I0225 08:24:50.016113 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:24:50 crc kubenswrapper[4749]: I0225 08:24:50.508029 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:24:51 crc kubenswrapper[4749]: I0225 08:24:51.202286 4749 generic.go:334] "Generic (PLEG): container finished" podID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerID="f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04" exitCode=0 Feb 25 08:24:51 crc kubenswrapper[4749]: I0225 08:24:51.202329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerDied","Data":"f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04"} Feb 25 08:24:51 crc kubenswrapper[4749]: I0225 08:24:51.202357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerStarted","Data":"2a4d7eafa57f6c94831c19892e97b3c36e1ac01e9531017bc642168a683f5fee"} Feb 25 08:24:52 crc kubenswrapper[4749]: I0225 08:24:52.224206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerStarted","Data":"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47"} Feb 25 08:24:53 crc kubenswrapper[4749]: I0225 08:24:53.239089 4749 generic.go:334] "Generic (PLEG): container finished" podID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerID="e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47" exitCode=0 Feb 25 08:24:53 crc kubenswrapper[4749]: I0225 08:24:53.239179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" 
event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerDied","Data":"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47"} Feb 25 08:24:54 crc kubenswrapper[4749]: I0225 08:24:54.251281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerStarted","Data":"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754"} Feb 25 08:24:54 crc kubenswrapper[4749]: I0225 08:24:54.286473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-776zd" podStartSLOduration=2.814151751 podStartE2EDuration="5.286456519s" podCreationTimestamp="2026-02-25 08:24:49 +0000 UTC" firstStartedPulling="2026-02-25 08:24:51.204621022 +0000 UTC m=+4044.566447042" lastFinishedPulling="2026-02-25 08:24:53.67692578 +0000 UTC m=+4047.038751810" observedRunningTime="2026-02-25 08:24:54.274654162 +0000 UTC m=+4047.636480192" watchObservedRunningTime="2026-02-25 08:24:54.286456519 +0000 UTC m=+4047.648282549" Feb 25 08:24:54 crc kubenswrapper[4749]: I0225 08:24:54.653060 4749 scope.go:117] "RemoveContainer" containerID="8a5204df938b3f64d23f598e19e8b44d7de36c38e0bbb8de2e39a028aa99ed8a" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.161869 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b57cbd48-btws7_35a9ddc1-5f7b-4a22-8b8f-45b895c6731c/barbican-api/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.340339 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6b57cbd48-btws7_35a9ddc1-5f7b-4a22-8b8f-45b895c6731c/barbican-api-log/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.399311 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d5c5c5846-h56g5_004d8426-4842-4f64-ba76-6ee6afed85de/barbican-keystone-listener/0.log" Feb 25 08:24:55 
crc kubenswrapper[4749]: I0225 08:24:55.422316 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5d5c5c5846-h56g5_004d8426-4842-4f64-ba76-6ee6afed85de/barbican-keystone-listener-log/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.534695 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-df9d99775-z76dt_9111c168-e7cf-494e-a603-93b1f9db0b73/barbican-worker/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.582619 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-df9d99775-z76dt_9111c168-e7cf-494e-a603-93b1f9db0b73/barbican-worker-log/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.744509 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-bjsnq_42b06635-d369-4299-92f6-e912f4d811df/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.817523 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/ceilometer-central-agent/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.867262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/ceilometer-notification-agent/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.957898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/proxy-httpd/0.log" Feb 25 08:24:55 crc kubenswrapper[4749]: I0225 08:24:55.976462 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b3e4586b-587a-4c8f-9387-70cb52411a46/sg-core/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.105500 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_65d2cd6c-9f4d-4d9d-9032-7798a57b7aec/cinder-api/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.170081 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_65d2cd6c-9f4d-4d9d-9032-7798a57b7aec/cinder-api-log/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.333886 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6ba75dd-bf4c-4d7e-88b3-cf11679c231a/probe/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.360503 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b6ba75dd-bf4c-4d7e-88b3-cf11679c231a/cinder-scheduler/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.409099 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vfqng_f2f01883-686b-4aed-9458-ee14d1c3eb10/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.531869 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6w4jk_4237ec4c-49e0-4c6d-8a5c-d67583610f3d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.604292 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/init/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.794010 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/init/0.log" Feb 25 08:24:56 crc kubenswrapper[4749]: I0225 08:24:56.857353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-lblvn_c18a0e33-8fc2-4653-8a90-39b356b71af2/dnsmasq-dns/0.log" Feb 25 08:24:56 crc 
kubenswrapper[4749]: I0225 08:24:56.894567 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pvhlc_fd2d74c2-5270-4697-b4fb-47a5affbbf68/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.076099 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3957039-d105-44aa-865d-08cf1bd562bf/glance-log/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.099993 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d3957039-d105-44aa-865d-08cf1bd562bf/glance-httpd/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.294537 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64d42008-7546-4307-9953-37a51af1df8a/glance-httpd/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.438586 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_64d42008-7546-4307-9953-37a51af1df8a/glance-log/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.446155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75597b5c88-58jkm_43b78d96-31fe-4729-aacf-09c66c121861/horizon/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.658913 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pfnbz_7e445ab6-1f18-49fe-b3f4-0921714e4d08/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.825057 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75597b5c88-58jkm_43b78d96-31fe-4729-aacf-09c66c121861/horizon-log/0.log" Feb 25 08:24:57 crc kubenswrapper[4749]: I0225 08:24:57.880755 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-vjvzc_2c3eb600-4864-4229-bbfc-6b24211fc914/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.067844 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-567ffd99f4-495rj_d2f908fa-2e8d-44bd-ac10-d745ce196bda/keystone-api/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.211661 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29533441-4v5xd_4b21ba16-5e25-41fd-afbb-82072bfdc006/keystone-cron/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.361122 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_11165cb9-7a9a-425b-8eea-42e61a784a57/kube-state-metrics/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.392705 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-mpvfw_04b421fd-689e-4212-85a9-ffaecfe63fbe/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.862791 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-668b654645-4xlm5_8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6/neutron-httpd/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.895207 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-668b654645-4xlm5_8b0a5c3f-66e0-445f-9405-f1d2bdefd8f6/neutron-api/0.log" Feb 25 08:24:58 crc kubenswrapper[4749]: I0225 08:24:58.967197 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-shr95_f5a0d4ab-e5e7-41c9-bb91-c89d59c6a92f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:24:59 crc kubenswrapper[4749]: I0225 08:24:59.525173 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_ea427259-a5cd-455d-a3c3-7031a607e42c/nova-api-log/0.log" Feb 25 08:24:59 crc kubenswrapper[4749]: I0225 08:24:59.671714 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1bc34300-d817-4b83-865e-2cc2c5ffb31a/nova-cell0-conductor-conductor/0.log" Feb 25 08:24:59 crc kubenswrapper[4749]: I0225 08:24:59.965429 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e800eb5f-0979-4ad3-8f0a-1adab77e1259/nova-cell1-conductor-conductor/0.log" Feb 25 08:24:59 crc kubenswrapper[4749]: I0225 08:24:59.988131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ea427259-a5cd-455d-a3c3-7031a607e42c/nova-api-api/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.017663 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.017702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.047572 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e7c9f2e8-f03e-40a1-bfa8-ea1fc396ad08/nova-cell1-novncproxy-novncproxy/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.277023 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8b4782e-5a90-4774-819b-dc12f4c1b585/nova-metadata-log/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.312688 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l2dl2_9b501ea6-b5f9-497b-9da6-072e7a0fde7a/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.686518 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/mysql-bootstrap/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.709778 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_adbf2a7f-f028-4bd3-8db2-0b5f58f6fc79/nova-scheduler-scheduler/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.878069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/galera/0.log" Feb 25 08:25:00 crc kubenswrapper[4749]: I0225 08:25:00.895775 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_966f467d-732d-45df-b9d1-bb88be2e34cf/mysql-bootstrap/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.067923 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-776zd" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="registry-server" probeResult="failure" output=< Feb 25 08:25:01 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Feb 25 08:25:01 crc kubenswrapper[4749]: > Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.113096 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/mysql-bootstrap/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.294380 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/mysql-bootstrap/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.309105 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_725d8d83-d9a6-4c99-86f1-71371b41c11f/galera/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.513141 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_a7c13290-8c43-443a-8563-ea54a96c975a/openstackclient/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.513613 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9jt2g_4197a74b-885d-41df-8484-05e645656b2a/ovn-controller/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.569116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8b4782e-5a90-4774-819b-dc12f4c1b585/nova-metadata-metadata/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.716428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rtctt_cd7c992f-27dd-409a-a7db-34b40e2ed6eb/openstack-network-exporter/0.log" Feb 25 08:25:01 crc kubenswrapper[4749]: I0225 08:25:01.806770 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server-init/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.014288 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server-init/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.023393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovs-vswitchd/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.046085 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-c5k7v_75a58813-9b9d-4d38-aa91-527463cdbccf/ovsdb-server/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.221380 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d644537f-b5d7-4f78-be98-d61b2f1d6ac3/ovn-northd/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.247358 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_d644537f-b5d7-4f78-be98-d61b2f1d6ac3/openstack-network-exporter/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.312250 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5nv2h_4671b264-81d8-4dfb-9bb9-33a1f2c46068/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.430002 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8fa832b-da81-48f1-b4d6-e72688303d93/openstack-network-exporter/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.468203 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8fa832b-da81-48f1-b4d6-e72688303d93/ovsdbserver-nb/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.630055 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_afd2522b-bd24-455e-bc5f-a62caba2ff23/openstack-network-exporter/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.724846 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_afd2522b-bd24-455e-bc5f-a62caba2ff23/ovsdbserver-sb/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.896569 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5578bc7b56-qlg29_90ab2780-2dee-40f9-a4c6-529e08d4de0b/placement-api/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.947374 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5578bc7b56-qlg29_90ab2780-2dee-40f9-a4c6-529e08d4de0b/placement-log/0.log" Feb 25 08:25:02 crc kubenswrapper[4749]: I0225 08:25:02.950482 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/setup-container/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.113174 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/setup-container/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.177512 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_21dc0c76-2b5c-43cd-93e0-9f85eb8b102d/rabbitmq/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.204903 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/setup-container/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.322375 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:25:03 crc kubenswrapper[4749]: E0225 08:25:03.322707 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.391030 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/setup-container/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.446522 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bddd8d20-d985-4c29-b82a-9bc75a6c40b9/rabbitmq/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.467298 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rkxtd_daf53e51-e69a-43fd-bfa4-50ffcd4c9234/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: 
I0225 08:25:03.655277 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hx724_9653d4b7-617d-4ca2-a596-3f4ab7086b05/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.706038 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zdspf_911f3de9-9115-4be2-98f8-a1e26e35387a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.995534 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nfvzb_11ec0661-d541-4c78-bc67-3bcb2e908694/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:03 crc kubenswrapper[4749]: I0225 08:25:03.996177 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-2w792_357114b3-24d4-4f6f-ba27-99c4314d110d/ssh-known-hosts-edpm-deployment/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.244945 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6846d6d889-85shz_ff8a7476-3a62-4af3-a5bb-8a4b8d60108f/proxy-server/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.350063 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6846d6d889-85shz_ff8a7476-3a62-4af3-a5bb-8a4b8d60108f/proxy-httpd/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.394927 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tw69n_9d9bb0e5-da1f-480e-8a59-e00767290acc/swift-ring-rebalance/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.432604 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-auditor/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.540688 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-reaper/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.597741 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-server/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.598032 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/account-replicator/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.610421 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-auditor/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.782272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-replicator/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.794999 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-server/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.819037 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/container-updater/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.869818 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-auditor/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.944434 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-expirer/0.log" Feb 25 08:25:04 crc kubenswrapper[4749]: I0225 08:25:04.997786 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-server/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.039833 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-replicator/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.052507 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/object-updater/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.109237 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/rsync/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.188660 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4664b9ba-7c02-431a-89c5-715d216ee127/swift-recon-cron/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.329672 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mnmxq_18d17a7c-1c9d-47bc-818d-c2f567dfe075/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.426960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6ffbba01-9e0c-4754-a378-68eaba4c858e/tempest-tests-tempest-tests-runner/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.520067 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_262268cf-ce8f-4780-9acc-a642f473b902/test-operator-logs-container/0.log" Feb 25 08:25:05 crc kubenswrapper[4749]: I0225 08:25:05.620709 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-42z9m_e2e2f83c-2c2c-41d2-83ca-9fa2ab3f53cf/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 25 08:25:10 crc kubenswrapper[4749]: I0225 08:25:10.058426 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:10 crc kubenswrapper[4749]: I0225 08:25:10.107903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:10 crc kubenswrapper[4749]: I0225 08:25:10.290384 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.365026 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-776zd" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="registry-server" containerID="cri-o://8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754" gracePeriod=2 Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.821405 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.952705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqqms\" (UniqueName: \"kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms\") pod \"30e04496-19f8-4b25-ab5b-7b005b227e1f\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.952762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities\") pod \"30e04496-19f8-4b25-ab5b-7b005b227e1f\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.952835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content\") pod \"30e04496-19f8-4b25-ab5b-7b005b227e1f\" (UID: \"30e04496-19f8-4b25-ab5b-7b005b227e1f\") " Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.954005 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities" (OuterVolumeSpecName: "utilities") pod "30e04496-19f8-4b25-ab5b-7b005b227e1f" (UID: "30e04496-19f8-4b25-ab5b-7b005b227e1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:25:11 crc kubenswrapper[4749]: I0225 08:25:11.957980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms" (OuterVolumeSpecName: "kube-api-access-cqqms") pod "30e04496-19f8-4b25-ab5b-7b005b227e1f" (UID: "30e04496-19f8-4b25-ab5b-7b005b227e1f"). InnerVolumeSpecName "kube-api-access-cqqms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.054715 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqqms\" (UniqueName: \"kubernetes.io/projected/30e04496-19f8-4b25-ab5b-7b005b227e1f-kube-api-access-cqqms\") on node \"crc\" DevicePath \"\"" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.054742 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.078336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30e04496-19f8-4b25-ab5b-7b005b227e1f" (UID: "30e04496-19f8-4b25-ab5b-7b005b227e1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.156099 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30e04496-19f8-4b25-ab5b-7b005b227e1f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.386819 4749 generic.go:334] "Generic (PLEG): container finished" podID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerID="8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754" exitCode=0 Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.386865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerDied","Data":"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754"} Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.386892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-776zd" event={"ID":"30e04496-19f8-4b25-ab5b-7b005b227e1f","Type":"ContainerDied","Data":"2a4d7eafa57f6c94831c19892e97b3c36e1ac01e9531017bc642168a683f5fee"} Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.386910 4749 scope.go:117] "RemoveContainer" containerID="8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.386912 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-776zd" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.418136 4749 scope.go:117] "RemoveContainer" containerID="e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.423158 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.439958 4749 scope.go:117] "RemoveContainer" containerID="f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.443675 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-776zd"] Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.476854 4749 scope.go:117] "RemoveContainer" containerID="8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754" Feb 25 08:25:12 crc kubenswrapper[4749]: E0225 08:25:12.477436 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754\": container with ID starting with 8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754 not found: ID does not exist" containerID="8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.477556 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754"} err="failed to get container status \"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754\": rpc error: code = NotFound desc = could not find container \"8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754\": container with ID starting with 8e2af2c5b8b56c9f6b3c4f3d83f74342d012849b0f45bb8000c72cd3d5ea0754 not found: ID does not exist" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.477731 4749 scope.go:117] "RemoveContainer" containerID="e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47" Feb 25 08:25:12 crc kubenswrapper[4749]: E0225 08:25:12.478298 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47\": container with ID starting with e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47 not found: ID does not exist" containerID="e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.478467 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47"} err="failed to get container status \"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47\": rpc error: code = NotFound desc = could not find container \"e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47\": container with ID starting with e144d6b887e5e458f658c5eff037c8e23e9565794bb4d1ded1601dd1494c5c47 not found: ID does not exist" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.478632 4749 scope.go:117] "RemoveContainer" containerID="f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04" Feb 25 08:25:12 crc kubenswrapper[4749]: E0225 
08:25:12.479206 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04\": container with ID starting with f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04 not found: ID does not exist" containerID="f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04" Feb 25 08:25:12 crc kubenswrapper[4749]: I0225 08:25:12.479365 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04"} err="failed to get container status \"f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04\": rpc error: code = NotFound desc = could not find container \"f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04\": container with ID starting with f44a44458219172cc5b18ef1e27bfe5cca9b09e3b75f7f1c660fac86b2da7a04 not found: ID does not exist" Feb 25 08:25:13 crc kubenswrapper[4749]: I0225 08:25:13.331450 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" path="/var/lib/kubelet/pods/30e04496-19f8-4b25-ab5b-7b005b227e1f/volumes" Feb 25 08:25:14 crc kubenswrapper[4749]: I0225 08:25:14.322270 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:25:14 crc kubenswrapper[4749]: E0225 08:25:14.322637 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" Feb 25 08:25:14 crc kubenswrapper[4749]: I0225 08:25:14.535180 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_df3ffcd0-1fa9-4c25-a331-33baf2a3acfd/memcached/0.log" Feb 25 08:25:26 crc kubenswrapper[4749]: I0225 08:25:26.322867 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:25:27 crc kubenswrapper[4749]: I0225 08:25:27.540569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187"} Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.281259 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.441990 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.450461 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.519527 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.634007 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/util/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: 
I0225 08:25:31.648017 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/pull/0.log" Feb 25 08:25:31 crc kubenswrapper[4749]: I0225 08:25:31.683547 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0d601b1457cf6343d944ca339ab0782737039759b755cb529cadcbc6b44nx8t_08be220c-12a9-4f49-b123-7a1413328415/extract/0.log" Feb 25 08:25:32 crc kubenswrapper[4749]: I0225 08:25:32.182842 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-4jdng_a8412981-280e-4153-b15e-7a5df751e110/manager/0.log" Feb 25 08:25:32 crc kubenswrapper[4749]: I0225 08:25:32.552952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-7h9cc_23e28440-3881-4d38-885e-3a20842b117d/manager/0.log" Feb 25 08:25:32 crc kubenswrapper[4749]: I0225 08:25:32.693244 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-9tv6m_dc40a3f9-e350-41dc-b13d-86ae3e46f551/manager/0.log" Feb 25 08:25:32 crc kubenswrapper[4749]: I0225 08:25:32.929408 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-2tccs_0a7bf49d-b6d9-474b-b97c-cb555aa93f8a/manager/0.log" Feb 25 08:25:33 crc kubenswrapper[4749]: I0225 08:25:33.444332 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-5nbpj_a8e8e364-0b06-4b3d-9faf-4c7c3c233060/manager/0.log" Feb 25 08:25:33 crc kubenswrapper[4749]: I0225 08:25:33.533045 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-rtn6b_b1ea3312-6b21-440a-b617-5681d286bcc4/manager/0.log" Feb 25 
08:25:33 crc kubenswrapper[4749]: I0225 08:25:33.856398 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-zjvss_17bb447b-e6f8-4a04-98ea-8559cbd26d34/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.082397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-b2j2v_0594f0f5-d0ce-4c43-8572-3dc16130152e/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.273697 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-9jh8w_0d2964a7-7341-4f1f-ab51-b648ea057535/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.317764 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-g4h2l_b394d64a-2cf2-4cad-9b51-adbf56cb696c/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.537870 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-sptf9_b54d6e5b-b74d-4e25-808a-20383b1b02e0/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.615699 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-q64h7_c881d86e-332d-419d-8c8a-9b7dfafe8c3c/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.741689 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-2lrvt_9fcf9db9-6abb-445d-aa8c-8d5e60431838/manager/0.log" Feb 25 08:25:34 crc kubenswrapper[4749]: I0225 08:25:34.902935 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9c6xd5c_6f55e1ee-705c-409e-b34a-77232bf089eb/manager/0.log" Feb 25 08:25:35 crc kubenswrapper[4749]: I0225 08:25:35.176728 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57b9579b7-7f7fh_3effb281-c000-4831-89a1-8b85dfc219b3/operator/0.log" Feb 25 08:25:35 crc kubenswrapper[4749]: I0225 08:25:35.397508 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tk29m_7b2662ca-97d0-4f81-b90a-c2735bd2a62a/registry-server/0.log" Feb 25 08:25:35 crc kubenswrapper[4749]: I0225 08:25:35.632393 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-n4v8h_e1c9da55-b9bc-4bb7-b73a-ca73c928f333/manager/0.log" Feb 25 08:25:35 crc kubenswrapper[4749]: I0225 08:25:35.757532 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-5gt8h_9bae09cb-1d81-4971-82bc-84d87a6dca77/manager/0.log" Feb 25 08:25:35 crc kubenswrapper[4749]: I0225 08:25:35.857835 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-vx8bh_d3e727f4-f059-41f3-94d1-fef7a644f2b2/operator/0.log" Feb 25 08:25:36 crc kubenswrapper[4749]: I0225 08:25:36.119065 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-nqhfv_e1152a21-82e3-4a7f-92c8-8633abeecb26/manager/0.log" Feb 25 08:25:36 crc kubenswrapper[4749]: I0225 08:25:36.224072 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-589c568786-wv258_6dc6924f-3556-4ac1-b787-57fa3e20297f/manager/0.log" Feb 25 08:25:36 crc kubenswrapper[4749]: I0225 08:25:36.319338 4749 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-wh7zn_0ada39b5-592e-448e-9a12-2f9e95906a74/manager/0.log" Feb 25 08:25:36 crc kubenswrapper[4749]: I0225 08:25:36.515551 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-7fzsq_6b0526d5-8f6f-47d4-87f5-0deb2c091848/manager/0.log" Feb 25 08:25:36 crc kubenswrapper[4749]: I0225 08:25:36.934020 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6699bbbd4-8bgxl_37a9e86e-0ee0-4447-910a-a185f4681508/manager/0.log" Feb 25 08:25:41 crc kubenswrapper[4749]: I0225 08:25:41.559357 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-fn75h_12921b78-d19b-4a5c-be9e-5cf412b88186/manager/0.log" Feb 25 08:25:58 crc kubenswrapper[4749]: I0225 08:25:58.916895 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2rbvc_85efff27-fc96-4191-9733-a6a2434a723c/control-plane-machine-set-operator/0.log" Feb 25 08:25:59 crc kubenswrapper[4749]: I0225 08:25:59.523338 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mqgw7_723f3a22-e0cb-4b03-952d-7f4e6aece976/kube-rbac-proxy/0.log" Feb 25 08:25:59 crc kubenswrapper[4749]: I0225 08:25:59.550654 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mqgw7_723f3a22-e0cb-4b03-952d-7f4e6aece976/machine-api-operator/0.log" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.137838 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533466-brbfw"] Feb 25 08:26:00 crc kubenswrapper[4749]: E0225 08:26:00.138718 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="registry-server" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.138745 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="registry-server" Feb 25 08:26:00 crc kubenswrapper[4749]: E0225 08:26:00.138759 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="extract-content" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.138766 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="extract-content" Feb 25 08:26:00 crc kubenswrapper[4749]: E0225 08:26:00.138789 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="extract-utilities" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.138799 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="extract-utilities" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.139056 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e04496-19f8-4b25-ab5b-7b005b227e1f" containerName="registry-server" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.139878 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.142006 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.142503 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.144093 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.146311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533466-brbfw"] Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.181739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fb6j\" (UniqueName: \"kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j\") pod \"auto-csr-approver-29533466-brbfw\" (UID: \"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc\") " pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.284168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fb6j\" (UniqueName: \"kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j\") pod \"auto-csr-approver-29533466-brbfw\" (UID: \"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc\") " pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.308917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fb6j\" (UniqueName: \"kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j\") pod \"auto-csr-approver-29533466-brbfw\" (UID: \"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc\") " 
pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.494092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:00 crc kubenswrapper[4749]: I0225 08:26:00.980092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533466-brbfw"] Feb 25 08:26:01 crc kubenswrapper[4749]: W0225 08:26:01.210998 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a76cbdb_bb28_447d_a3ab_954bbe80dcbc.slice/crio-c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5 WatchSource:0}: Error finding container c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5: Status 404 returned error can't find the container with id c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5 Feb 25 08:26:01 crc kubenswrapper[4749]: I0225 08:26:01.214300 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 25 08:26:01 crc kubenswrapper[4749]: I0225 08:26:01.877776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533466-brbfw" event={"ID":"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc","Type":"ContainerStarted","Data":"c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5"} Feb 25 08:26:02 crc kubenswrapper[4749]: I0225 08:26:02.892589 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" containerID="18304353c3ef96d37df16f0ab893beac21c2e61c98e603636eec1408cabf946a" exitCode=0 Feb 25 08:26:02 crc kubenswrapper[4749]: I0225 08:26:02.892667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533466-brbfw" 
event={"ID":"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc","Type":"ContainerDied","Data":"18304353c3ef96d37df16f0ab893beac21c2e61c98e603636eec1408cabf946a"} Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.264362 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.283243 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fb6j\" (UniqueName: \"kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j\") pod \"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc\" (UID: \"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc\") " Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.289211 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j" (OuterVolumeSpecName: "kube-api-access-2fb6j") pod "0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" (UID: "0a76cbdb-bb28-447d-a3ab-954bbe80dcbc"). InnerVolumeSpecName "kube-api-access-2fb6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.385747 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fb6j\" (UniqueName: \"kubernetes.io/projected/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc-kube-api-access-2fb6j\") on node \"crc\" DevicePath \"\"" Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.911069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533466-brbfw" event={"ID":"0a76cbdb-bb28-447d-a3ab-954bbe80dcbc","Type":"ContainerDied","Data":"c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5"} Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.911117 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533466-brbfw" Feb 25 08:26:04 crc kubenswrapper[4749]: I0225 08:26:04.911122 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c839e7963d0713129b0001580ed79849fe673a9c51e0734afed127b5b4c827c5" Feb 25 08:26:05 crc kubenswrapper[4749]: I0225 08:26:05.359541 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533460-hb9dm"] Feb 25 08:26:05 crc kubenswrapper[4749]: I0225 08:26:05.369413 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533460-hb9dm"] Feb 25 08:26:07 crc kubenswrapper[4749]: I0225 08:26:07.335713 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a56de3a7-6733-46ef-a34d-1e1f6fdee6d2" path="/var/lib/kubelet/pods/a56de3a7-6733-46ef-a34d-1e1f6fdee6d2/volumes" Feb 25 08:26:09 crc kubenswrapper[4749]: I0225 08:26:09.925102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:09 crc kubenswrapper[4749]: E0225 08:26:09.925854 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" containerName="oc" Feb 25 08:26:09 crc kubenswrapper[4749]: I0225 08:26:09.925869 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" containerName="oc" Feb 25 08:26:09 crc kubenswrapper[4749]: I0225 08:26:09.926049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" containerName="oc" Feb 25 08:26:09 crc kubenswrapper[4749]: I0225 08:26:09.927286 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:09 crc kubenswrapper[4749]: I0225 08:26:09.935450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.009106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.009149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.009215 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258mt\" (UniqueName: \"kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.111530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.111563 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.111608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258mt\" (UniqueName: \"kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.112029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.112146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.129109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258mt\" (UniqueName: \"kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt\") pod \"redhat-marketplace-b98zd\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.254099 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.721848 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:10 crc kubenswrapper[4749]: I0225 08:26:10.982409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerStarted","Data":"db1a8e69fa633916a2a0d8f9149d574b0d3e3e5bca1186ec50593ff6a08f5a89"} Feb 25 08:26:12 crc kubenswrapper[4749]: I0225 08:26:12.003966 4749 generic.go:334] "Generic (PLEG): container finished" podID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerID="f8adb24cc2b90071fe652ea59e1738da5f87ca2115ae07acb41b5a3dfb05f150" exitCode=0 Feb 25 08:26:12 crc kubenswrapper[4749]: I0225 08:26:12.004035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerDied","Data":"f8adb24cc2b90071fe652ea59e1738da5f87ca2115ae07acb41b5a3dfb05f150"} Feb 25 08:26:14 crc kubenswrapper[4749]: I0225 08:26:14.056228 4749 generic.go:334] "Generic (PLEG): container finished" podID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerID="c9be9a6c60a72b22bdba1649b28f5a6a02a47f812e190c5bf7c37a04acf9a156" exitCode=0 Feb 25 08:26:14 crc kubenswrapper[4749]: I0225 08:26:14.056287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerDied","Data":"c9be9a6c60a72b22bdba1649b28f5a6a02a47f812e190c5bf7c37a04acf9a156"} Feb 25 08:26:14 crc kubenswrapper[4749]: I0225 08:26:14.555152 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-6858f_280567a7-b82f-4767-93b7-725ad0ff927e/cert-manager-controller/0.log" Feb 25 08:26:14 crc 
kubenswrapper[4749]: I0225 08:26:14.777564 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-h5fwj_acd8dc37-5680-4838-aa0e-bf79c4209283/cert-manager-webhook/0.log" Feb 25 08:26:14 crc kubenswrapper[4749]: I0225 08:26:14.803850 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-rtblf_e03b662d-9431-4217-b126-b2a7db9ab5e4/cert-manager-cainjector/0.log" Feb 25 08:26:15 crc kubenswrapper[4749]: I0225 08:26:15.066157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerStarted","Data":"56811f6711cb1e1555bf2f4b01d9e423c77200f5dec0b6161a774b2231a5d38f"} Feb 25 08:26:15 crc kubenswrapper[4749]: I0225 08:26:15.083484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b98zd" podStartSLOduration=3.659109141 podStartE2EDuration="6.083467618s" podCreationTimestamp="2026-02-25 08:26:09 +0000 UTC" firstStartedPulling="2026-02-25 08:26:12.006280053 +0000 UTC m=+4125.368106073" lastFinishedPulling="2026-02-25 08:26:14.43063853 +0000 UTC m=+4127.792464550" observedRunningTime="2026-02-25 08:26:15.080026905 +0000 UTC m=+4128.441852925" watchObservedRunningTime="2026-02-25 08:26:15.083467618 +0000 UTC m=+4128.445293628" Feb 25 08:26:20 crc kubenswrapper[4749]: I0225 08:26:20.254995 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:20 crc kubenswrapper[4749]: I0225 08:26:20.255448 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:20 crc kubenswrapper[4749]: I0225 08:26:20.313130 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 
25 08:26:21 crc kubenswrapper[4749]: I0225 08:26:21.700181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:21 crc kubenswrapper[4749]: I0225 08:26:21.760850 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:23 crc kubenswrapper[4749]: I0225 08:26:23.155795 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b98zd" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="registry-server" containerID="cri-o://56811f6711cb1e1555bf2f4b01d9e423c77200f5dec0b6161a774b2231a5d38f" gracePeriod=2 Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.167634 4749 generic.go:334] "Generic (PLEG): container finished" podID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerID="56811f6711cb1e1555bf2f4b01d9e423c77200f5dec0b6161a774b2231a5d38f" exitCode=0 Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.167716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerDied","Data":"56811f6711cb1e1555bf2f4b01d9e423c77200f5dec0b6161a774b2231a5d38f"} Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.720498 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.855480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-258mt\" (UniqueName: \"kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt\") pod \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.855639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content\") pod \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.855661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities\") pod \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\" (UID: \"855af2f6-0ebc-4842-8ea0-5d87cac7f896\") " Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.860819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities" (OuterVolumeSpecName: "utilities") pod "855af2f6-0ebc-4842-8ea0-5d87cac7f896" (UID: "855af2f6-0ebc-4842-8ea0-5d87cac7f896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.862800 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt" (OuterVolumeSpecName: "kube-api-access-258mt") pod "855af2f6-0ebc-4842-8ea0-5d87cac7f896" (UID: "855af2f6-0ebc-4842-8ea0-5d87cac7f896"). InnerVolumeSpecName "kube-api-access-258mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.879275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "855af2f6-0ebc-4842-8ea0-5d87cac7f896" (UID: "855af2f6-0ebc-4842-8ea0-5d87cac7f896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.957144 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-258mt\" (UniqueName: \"kubernetes.io/projected/855af2f6-0ebc-4842-8ea0-5d87cac7f896-kube-api-access-258mt\") on node \"crc\" DevicePath \"\"" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.957178 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 25 08:26:24 crc kubenswrapper[4749]: I0225 08:26:24.957189 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/855af2f6-0ebc-4842-8ea0-5d87cac7f896-utilities\") on node \"crc\" DevicePath \"\"" Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.180997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b98zd" event={"ID":"855af2f6-0ebc-4842-8ea0-5d87cac7f896","Type":"ContainerDied","Data":"db1a8e69fa633916a2a0d8f9149d574b0d3e3e5bca1186ec50593ff6a08f5a89"} Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.181066 4749 scope.go:117] "RemoveContainer" containerID="56811f6711cb1e1555bf2f4b01d9e423c77200f5dec0b6161a774b2231a5d38f" Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.181179 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b98zd" Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.212822 4749 scope.go:117] "RemoveContainer" containerID="c9be9a6c60a72b22bdba1649b28f5a6a02a47f812e190c5bf7c37a04acf9a156" Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.235221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.249736 4749 scope.go:117] "RemoveContainer" containerID="f8adb24cc2b90071fe652ea59e1738da5f87ca2115ae07acb41b5a3dfb05f150" Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.257358 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b98zd"] Feb 25 08:26:25 crc kubenswrapper[4749]: I0225 08:26:25.338363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" path="/var/lib/kubelet/pods/855af2f6-0ebc-4842-8ea0-5d87cac7f896/volumes" Feb 25 08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.106985 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-gvgqz_3b819184-1695-4312-a0f4-0e0bad53a7d7/nmstate-console-plugin/0.log" Feb 25 08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.229040 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-xpxjw_eef1bf93-8a75-4ff2-b172-7601b6861aef/nmstate-handler/0.log" Feb 25 08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.294544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-wnqc6_7ba6588d-93c3-481d-8606-1b91fee0267a/kube-rbac-proxy/0.log" Feb 25 08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.400276 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-wnqc6_7ba6588d-93c3-481d-8606-1b91fee0267a/nmstate-metrics/0.log" Feb 25 
08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.463141 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-tpddj_956da30d-e0b4-45a1-a3b4-773cdada4e30/nmstate-operator/0.log" Feb 25 08:26:31 crc kubenswrapper[4749]: I0225 08:26:31.640406 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-6flq6_ed80efda-7ad1-4992-9d76-f94d50e57216/nmstate-webhook/0.log" Feb 25 08:26:54 crc kubenswrapper[4749]: I0225 08:26:54.851441 4749 scope.go:117] "RemoveContainer" containerID="e65a025c15eb5dfa88d81c43993789ea8fd98371726bee340b21f3f5936fca1a" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.399247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jz8zq_9c8c2184-5984-4567-9de5-0141b3fc7fcd/kube-rbac-proxy/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.424556 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-jz8zq_9c8c2184-5984-4567-9de5-0141b3fc7fcd/controller/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.594368 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.813217 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.823549 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: I0225 08:27:04.840688 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:27:04 crc kubenswrapper[4749]: 
I0225 08:27:04.842959 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.062430 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.068105 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.082282 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.086938 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.257154 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-metrics/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.261198 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-reloader/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.266656 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/cp-frr-files/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.292219 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/controller/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.437735 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/kube-rbac-proxy/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.513699 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/frr-metrics/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.525096 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/kube-rbac-proxy-frr/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.682984 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/reloader/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.779578 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-lh45s_3bb90660-64f8-43f6-b7c0-a4449f75c9fb/frr-k8s-webhook-server/0.log" Feb 25 08:27:05 crc kubenswrapper[4749]: I0225 08:27:05.947174 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56887f7db6-72j45_3ae3ac00-e47a-4cc0-ba56-9f0f885163ca/manager/0.log" Feb 25 08:27:06 crc kubenswrapper[4749]: I0225 08:27:06.097074 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-96dfd7b56-f9wq6_aacb357e-4983-4cb4-86df-6fd3119e8b15/webhook-server/0.log" Feb 25 08:27:06 crc kubenswrapper[4749]: I0225 08:27:06.683519 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k7r4k_f9ece4c1-da53-4718-8a3a-aa9c0fd930da/kube-rbac-proxy/0.log" Feb 25 08:27:07 crc kubenswrapper[4749]: I0225 08:27:07.103132 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-k7r4k_f9ece4c1-da53-4718-8a3a-aa9c0fd930da/speaker/0.log" Feb 25 08:27:07 crc kubenswrapper[4749]: I0225 08:27:07.170240 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8cg2t_82f32e7a-8ffc-4429-bfd7-a12db75ed64a/frr/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.064409 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.286452 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.308248 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.308927 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.484467 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/pull/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.489008 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/extract/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.490799 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc2132k5fr_64fb5ac6-97f4-4e82-9dea-8d439e3f1fd1/util/0.log" Feb 25 08:27:22 crc 
kubenswrapper[4749]: I0225 08:27:22.659951 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.859113 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.897069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:27:22 crc kubenswrapper[4749]: I0225 08:27:22.904980 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.132507 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-content/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.138891 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/extract-utilities/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.337294 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.557428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.593050 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.627256 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.692116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8bqdr_424795b7-1cab-4e2e-a3c6-a3d63283910c/registry-server/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.785422 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-utilities/0.log" Feb 25 08:27:23 crc kubenswrapper[4749]: I0225 08:27:23.787341 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/extract-content/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.047877 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.107632 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-67nt9_9d6bfe4a-4275-420c-a7e1-dd0ca3c3a2cc/registry-server/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.189054 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.222003 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.266052 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.421315 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/util/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.441481 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/pull/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.484420 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7m4qq_f03e9291-ddbb-4b8a-b96c-c66a604694d2/extract/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.759688 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pk7lm_e899a950-b7af-4fe2-b9db-856858e051fc/marketplace-operator/0.log" Feb 25 08:27:24 crc kubenswrapper[4749]: I0225 08:27:24.798131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.010645 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.061309 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.077082 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.242997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.270482 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/extract-content/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.416864 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5q9mx_2a5c6dae-0795-463f-ae4c-f2c3de61483a/registry-server/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.478739 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.611120 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.620713 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.664016 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.840083 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-utilities/0.log" Feb 25 08:27:25 crc kubenswrapper[4749]: I0225 08:27:25.843054 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/extract-content/0.log" Feb 25 08:27:26 crc kubenswrapper[4749]: I0225 08:27:26.531342 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zzwzn_6a02c53a-d462-4339-9bf9-3dc9fbc48c71/registry-server/0.log" Feb 25 08:27:51 crc kubenswrapper[4749]: I0225 08:27:51.671667 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:27:51 crc kubenswrapper[4749]: I0225 08:27:51.672184 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:27:58 crc kubenswrapper[4749]: E0225 08:27:58.598711 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:54548->38.102.83.151:43635: write tcp 38.102.83.151:54548->38.102.83.151:43635: write: broken pipe Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.152893 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533468-bfpxp"] 
Feb 25 08:28:00 crc kubenswrapper[4749]: E0225 08:28:00.154157 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="extract-content" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.154248 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="extract-content" Feb 25 08:28:00 crc kubenswrapper[4749]: E0225 08:28:00.154328 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="registry-server" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.154385 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="registry-server" Feb 25 08:28:00 crc kubenswrapper[4749]: E0225 08:28:00.154442 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="extract-utilities" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.154497 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="extract-utilities" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.154770 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="855af2f6-0ebc-4842-8ea0-5d87cac7f896" containerName="registry-server" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.155500 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.161897 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.161905 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.162274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.185368 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533468-bfpxp"] Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.340706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drdlf\" (UniqueName: \"kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf\") pod \"auto-csr-approver-29533468-bfpxp\" (UID: \"97bc2ed9-8f20-4e8b-9102-e4654e698d28\") " pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.444030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drdlf\" (UniqueName: \"kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf\") pod \"auto-csr-approver-29533468-bfpxp\" (UID: \"97bc2ed9-8f20-4e8b-9102-e4654e698d28\") " pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.474805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drdlf\" (UniqueName: \"kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf\") pod \"auto-csr-approver-29533468-bfpxp\" (UID: \"97bc2ed9-8f20-4e8b-9102-e4654e698d28\") " 
pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:00 crc kubenswrapper[4749]: I0225 08:28:00.481286 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:01 crc kubenswrapper[4749]: I0225 08:28:01.031506 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533468-bfpxp"] Feb 25 08:28:02 crc kubenswrapper[4749]: I0225 08:28:02.062172 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" event={"ID":"97bc2ed9-8f20-4e8b-9102-e4654e698d28","Type":"ContainerStarted","Data":"5836f3af75cc6ad4e7a342b6135e41340f1d1fb76f746b836a3833555b4c1f72"} Feb 25 08:28:03 crc kubenswrapper[4749]: I0225 08:28:03.074910 4749 generic.go:334] "Generic (PLEG): container finished" podID="97bc2ed9-8f20-4e8b-9102-e4654e698d28" containerID="9bc61358bdf6762fc7e4287cae8c6004767e4b4a8bc42e358e2fb22a1bdda9a6" exitCode=0 Feb 25 08:28:03 crc kubenswrapper[4749]: I0225 08:28:03.075091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" event={"ID":"97bc2ed9-8f20-4e8b-9102-e4654e698d28","Type":"ContainerDied","Data":"9bc61358bdf6762fc7e4287cae8c6004767e4b4a8bc42e358e2fb22a1bdda9a6"} Feb 25 08:28:04 crc kubenswrapper[4749]: I0225 08:28:04.527649 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:04 crc kubenswrapper[4749]: I0225 08:28:04.625946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drdlf\" (UniqueName: \"kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf\") pod \"97bc2ed9-8f20-4e8b-9102-e4654e698d28\" (UID: \"97bc2ed9-8f20-4e8b-9102-e4654e698d28\") " Feb 25 08:28:04 crc kubenswrapper[4749]: I0225 08:28:04.648478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf" (OuterVolumeSpecName: "kube-api-access-drdlf") pod "97bc2ed9-8f20-4e8b-9102-e4654e698d28" (UID: "97bc2ed9-8f20-4e8b-9102-e4654e698d28"). InnerVolumeSpecName "kube-api-access-drdlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:28:04 crc kubenswrapper[4749]: I0225 08:28:04.729314 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drdlf\" (UniqueName: \"kubernetes.io/projected/97bc2ed9-8f20-4e8b-9102-e4654e698d28-kube-api-access-drdlf\") on node \"crc\" DevicePath \"\"" Feb 25 08:28:05 crc kubenswrapper[4749]: I0225 08:28:05.095452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" event={"ID":"97bc2ed9-8f20-4e8b-9102-e4654e698d28","Type":"ContainerDied","Data":"5836f3af75cc6ad4e7a342b6135e41340f1d1fb76f746b836a3833555b4c1f72"} Feb 25 08:28:05 crc kubenswrapper[4749]: I0225 08:28:05.095733 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5836f3af75cc6ad4e7a342b6135e41340f1d1fb76f746b836a3833555b4c1f72" Feb 25 08:28:05 crc kubenswrapper[4749]: I0225 08:28:05.095689 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533468-bfpxp" Feb 25 08:28:05 crc kubenswrapper[4749]: I0225 08:28:05.635295 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533462-cjwpr"] Feb 25 08:28:05 crc kubenswrapper[4749]: I0225 08:28:05.648485 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533462-cjwpr"] Feb 25 08:28:07 crc kubenswrapper[4749]: I0225 08:28:07.340873 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c24884-942b-413b-a8d2-e1fed54ab1d3" path="/var/lib/kubelet/pods/39c24884-942b-413b-a8d2-e1fed54ab1d3/volumes" Feb 25 08:28:21 crc kubenswrapper[4749]: I0225 08:28:21.671423 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:28:21 crc kubenswrapper[4749]: I0225 08:28:21.672142 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:28:51 crc kubenswrapper[4749]: I0225 08:28:51.671908 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 25 08:28:51 crc kubenswrapper[4749]: I0225 08:28:51.672493 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" 
podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 25 08:28:51 crc kubenswrapper[4749]: I0225 08:28:51.672534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" Feb 25 08:28:51 crc kubenswrapper[4749]: I0225 08:28:51.673336 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 25 08:28:51 crc kubenswrapper[4749]: I0225 08:28:51.673381 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187" gracePeriod=600 Feb 25 08:28:52 crc kubenswrapper[4749]: I0225 08:28:52.776164 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187" exitCode=0 Feb 25 08:28:52 crc kubenswrapper[4749]: I0225 08:28:52.776250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187"} Feb 25 08:28:52 crc kubenswrapper[4749]: I0225 08:28:52.777043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerStarted","Data":"d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"} Feb 25 08:28:52 crc kubenswrapper[4749]: I0225 08:28:52.777085 4749 scope.go:117] "RemoveContainer" containerID="d74c9469e04e0b4441c6594b571f4c7721a0d7a8afec49a9ea96e7d03d1d4c9a" Feb 25 08:28:54 crc kubenswrapper[4749]: I0225 08:28:54.988277 4749 scope.go:117] "RemoveContainer" containerID="c267f93506dffde9c89235856bf8a0616279974bd8a5929fdc29f8afae19ac28" Feb 25 08:29:13 crc kubenswrapper[4749]: I0225 08:29:13.067829 4749 generic.go:334] "Generic (PLEG): container finished" podID="3c08d2a9-92d6-4587-b813-d098e010a208" containerID="34b8e98960c51106f67a9a7b0fcf614e4fdacad206435a10043510c1f503fa8a" exitCode=0 Feb 25 08:29:13 crc kubenswrapper[4749]: I0225 08:29:13.067917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4j5k9/must-gather-xcbht" event={"ID":"3c08d2a9-92d6-4587-b813-d098e010a208","Type":"ContainerDied","Data":"34b8e98960c51106f67a9a7b0fcf614e4fdacad206435a10043510c1f503fa8a"} Feb 25 08:29:13 crc kubenswrapper[4749]: I0225 08:29:13.070238 4749 scope.go:117] "RemoveContainer" containerID="34b8e98960c51106f67a9a7b0fcf614e4fdacad206435a10043510c1f503fa8a" Feb 25 08:29:13 crc kubenswrapper[4749]: I0225 08:29:13.173547 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4j5k9_must-gather-xcbht_3c08d2a9-92d6-4587-b813-d098e010a208/gather/0.log" Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.004160 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4j5k9/must-gather-xcbht"] Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.004935 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4j5k9/must-gather-xcbht" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="copy" 
containerID="cri-o://70559ef78ba644fc16a321373a6bba4acc2a10ae4261bdde82ac5896e8519f58" gracePeriod=2 Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.016248 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4j5k9/must-gather-xcbht"] Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.205139 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4j5k9_must-gather-xcbht_3c08d2a9-92d6-4587-b813-d098e010a208/copy/0.log" Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.205847 4749 generic.go:334] "Generic (PLEG): container finished" podID="3c08d2a9-92d6-4587-b813-d098e010a208" containerID="70559ef78ba644fc16a321373a6bba4acc2a10ae4261bdde82ac5896e8519f58" exitCode=143 Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.470888 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4j5k9_must-gather-xcbht_3c08d2a9-92d6-4587-b813-d098e010a208/copy/0.log" Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.471365 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.610812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output\") pod \"3c08d2a9-92d6-4587-b813-d098e010a208\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.610861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjtlq\" (UniqueName: \"kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq\") pod \"3c08d2a9-92d6-4587-b813-d098e010a208\" (UID: \"3c08d2a9-92d6-4587-b813-d098e010a208\") " Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.783903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3c08d2a9-92d6-4587-b813-d098e010a208" (UID: "3c08d2a9-92d6-4587-b813-d098e010a208"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 25 08:29:25 crc kubenswrapper[4749]: I0225 08:29:25.814400 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c08d2a9-92d6-4587-b813-d098e010a208-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.109949 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq" (OuterVolumeSpecName: "kube-api-access-vjtlq") pod "3c08d2a9-92d6-4587-b813-d098e010a208" (UID: "3c08d2a9-92d6-4587-b813-d098e010a208"). InnerVolumeSpecName "kube-api-access-vjtlq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.119764 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjtlq\" (UniqueName: \"kubernetes.io/projected/3c08d2a9-92d6-4587-b813-d098e010a208-kube-api-access-vjtlq\") on node \"crc\" DevicePath \"\"" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.215681 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4j5k9_must-gather-xcbht_3c08d2a9-92d6-4587-b813-d098e010a208/copy/0.log" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.216178 4749 scope.go:117] "RemoveContainer" containerID="70559ef78ba644fc16a321373a6bba4acc2a10ae4261bdde82ac5896e8519f58" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.216247 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4j5k9/must-gather-xcbht" Feb 25 08:29:26 crc kubenswrapper[4749]: I0225 08:29:26.245892 4749 scope.go:117] "RemoveContainer" containerID="34b8e98960c51106f67a9a7b0fcf614e4fdacad206435a10043510c1f503fa8a" Feb 25 08:29:27 crc kubenswrapper[4749]: I0225 08:29:27.335365 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" path="/var/lib/kubelet/pods/3c08d2a9-92d6-4587-b813-d098e010a208/volumes" Feb 25 08:29:55 crc kubenswrapper[4749]: I0225 08:29:55.092585 4749 scope.go:117] "RemoveContainer" containerID="42959be7dd88a9e6f6a244938d551be5fcf56c6a0aefe51a9fe4ecf93859f1f1" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.164403 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533470-5pqw9"] Feb 25 08:30:00 crc kubenswrapper[4749]: E0225 08:30:00.166986 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="gather" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.167330 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="gather" Feb 25 08:30:00 crc kubenswrapper[4749]: E0225 08:30:00.167487 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="copy" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.167647 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="copy" Feb 25 08:30:00 crc kubenswrapper[4749]: E0225 08:30:00.167791 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97bc2ed9-8f20-4e8b-9102-e4654e698d28" containerName="oc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.167915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="97bc2ed9-8f20-4e8b-9102-e4654e698d28" containerName="oc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.168358 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="copy" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.168531 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c08d2a9-92d6-4587-b813-d098e010a208" containerName="gather" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.168724 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="97bc2ed9-8f20-4e8b-9102-e4654e698d28" containerName="oc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.169951 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.174411 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.174589 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.174782 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.178601 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc"] Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.180071 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.182847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.183004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.189309 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc"] Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.199382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533470-5pqw9"] Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.232450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.232510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brbqw\" (UniqueName: \"kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.232960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.233039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4xz6\" (UniqueName: \"kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6\") pod \"auto-csr-approver-29533470-5pqw9\" (UID: \"5af586de-45fb-43e5-9753-52e5c4abf374\") " pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.335114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 
08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.335156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brbqw\" (UniqueName: \"kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.335196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.335238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4xz6\" (UniqueName: \"kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6\") pod \"auto-csr-approver-29533470-5pqw9\" (UID: \"5af586de-45fb-43e5-9753-52e5c4abf374\") " pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.337158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.346512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume\") pod \"collect-profiles-29533470-z45dc\" (UID: 
\"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.359833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brbqw\" (UniqueName: \"kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw\") pod \"collect-profiles-29533470-z45dc\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.362705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4xz6\" (UniqueName: \"kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6\") pod \"auto-csr-approver-29533470-5pqw9\" (UID: \"5af586de-45fb-43e5-9753-52e5c4abf374\") " pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.511038 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:00 crc kubenswrapper[4749]: I0225 08:30:00.535864 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.075606 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533470-5pqw9"] Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.157378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc"] Feb 25 08:30:01 crc kubenswrapper[4749]: W0225 08:30:01.164677 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f25c0a_4b1e_4b35_9bb9_25e4e6e4c9bf.slice/crio-a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea WatchSource:0}: Error finding container a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea: Status 404 returned error can't find the container with id a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.616147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" event={"ID":"5af586de-45fb-43e5-9753-52e5c4abf374","Type":"ContainerStarted","Data":"9ed8085807281b90a8a2281392890c9b03106c5f7617febb6af3ba8177381b44"} Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.620244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" event={"ID":"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf","Type":"ContainerStarted","Data":"dce698d1e8287fc1cf59d67aaa9c79ba97bdebd16e69ce01035926d8cc7b6106"} Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.620297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" 
event={"ID":"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf","Type":"ContainerStarted","Data":"a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea"} Feb 25 08:30:01 crc kubenswrapper[4749]: I0225 08:30:01.654862 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" podStartSLOduration=1.654837819 podStartE2EDuration="1.654837819s" podCreationTimestamp="2026-02-25 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-25 08:30:01.64293391 +0000 UTC m=+4355.004759970" watchObservedRunningTime="2026-02-25 08:30:01.654837819 +0000 UTC m=+4355.016663849" Feb 25 08:30:02 crc kubenswrapper[4749]: I0225 08:30:02.635152 4749 generic.go:334] "Generic (PLEG): container finished" podID="98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" containerID="dce698d1e8287fc1cf59d67aaa9c79ba97bdebd16e69ce01035926d8cc7b6106" exitCode=0 Feb 25 08:30:02 crc kubenswrapper[4749]: I0225 08:30:02.635231 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" event={"ID":"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf","Type":"ContainerDied","Data":"dce698d1e8287fc1cf59d67aaa9c79ba97bdebd16e69ce01035926d8cc7b6106"} Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.247348 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.327004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume\") pod \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.327094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brbqw\" (UniqueName: \"kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw\") pod \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.327141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume\") pod \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\" (UID: \"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf\") " Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.330567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" (UID: "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.338832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw" (OuterVolumeSpecName: "kube-api-access-brbqw") pod "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" (UID: "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf"). 
InnerVolumeSpecName "kube-api-access-brbqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.338816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" (UID: "98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.429439 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brbqw\" (UniqueName: \"kubernetes.io/projected/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-kube-api-access-brbqw\") on node \"crc\" DevicePath \"\"" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.430149 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-config-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.430318 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.659563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" event={"ID":"98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf","Type":"ContainerDied","Data":"a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea"} Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.659615 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a964121c9da79b49055e6be7d4bd731bb5b0e520ed6f2f53aed8db593f2362ea" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.659731 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29533470-z45dc" Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.661809 4749 generic.go:334] "Generic (PLEG): container finished" podID="5af586de-45fb-43e5-9753-52e5c4abf374" containerID="eda0cbc69bc696665a2bfbd52fae25b453f844dcddd5b75bbfd0d23c6b3fb25f" exitCode=0 Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.661834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" event={"ID":"5af586de-45fb-43e5-9753-52e5c4abf374","Type":"ContainerDied","Data":"eda0cbc69bc696665a2bfbd52fae25b453f844dcddd5b75bbfd0d23c6b3fb25f"} Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.756990 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"] Feb 25 08:30:04 crc kubenswrapper[4749]: I0225 08:30:04.767063 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29533425-bw74r"] Feb 25 08:30:05 crc kubenswrapper[4749]: I0225 08:30:05.342875 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f99f78-940b-4c5e-b3f8-5d43fdc94340" path="/var/lib/kubelet/pods/81f99f78-940b-4c5e-b3f8-5d43fdc94340/volumes" Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.631633 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.681779 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4xz6\" (UniqueName: \"kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6\") pod \"5af586de-45fb-43e5-9753-52e5c4abf374\" (UID: \"5af586de-45fb-43e5-9753-52e5c4abf374\") " Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.696483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6" (OuterVolumeSpecName: "kube-api-access-b4xz6") pod "5af586de-45fb-43e5-9753-52e5c4abf374" (UID: "5af586de-45fb-43e5-9753-52e5c4abf374"). InnerVolumeSpecName "kube-api-access-b4xz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.706342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" event={"ID":"5af586de-45fb-43e5-9753-52e5c4abf374","Type":"ContainerDied","Data":"9ed8085807281b90a8a2281392890c9b03106c5f7617febb6af3ba8177381b44"} Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.706388 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed8085807281b90a8a2281392890c9b03106c5f7617febb6af3ba8177381b44" Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.706457 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29533470-5pqw9" Feb 25 08:30:06 crc kubenswrapper[4749]: I0225 08:30:06.793278 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4xz6\" (UniqueName: \"kubernetes.io/projected/5af586de-45fb-43e5-9753-52e5c4abf374-kube-api-access-b4xz6\") on node \"crc\" DevicePath \"\"" Feb 25 08:30:07 crc kubenswrapper[4749]: I0225 08:30:07.724496 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533464-9gqjf"] Feb 25 08:30:07 crc kubenswrapper[4749]: I0225 08:30:07.737273 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533464-9gqjf"] Feb 25 08:30:09 crc kubenswrapper[4749]: I0225 08:30:09.342164 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251b11d2-54f9-4aad-8ab8-e9f984c331ef" path="/var/lib/kubelet/pods/251b11d2-54f9-4aad-8ab8-e9f984c331ef/volumes" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.337057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2rzzw"] Feb 25 08:30:24 crc kubenswrapper[4749]: E0225 08:30:24.338269 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" containerName="collect-profiles" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.338291 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" containerName="collect-profiles" Feb 25 08:30:24 crc kubenswrapper[4749]: E0225 08:30:24.338326 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af586de-45fb-43e5-9753-52e5c4abf374" containerName="oc" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.338339 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af586de-45fb-43e5-9753-52e5c4abf374" containerName="oc" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.338791 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5af586de-45fb-43e5-9753-52e5c4abf374" containerName="oc" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.338834 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f25c0a-4b1e-4b35-9bb9-25e4e6e4c9bf" containerName="collect-profiles" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.341208 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.348927 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rzzw"] Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.495817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.496157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zqp\" (UniqueName: \"kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.496232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.598353 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.598858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.599125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.599298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zqp\" (UniqueName: \"kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.599418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.619908 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l4zqp\" (UniqueName: \"kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp\") pod \"community-operators-2rzzw\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") " pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:24 crc kubenswrapper[4749]: I0225 08:30:24.675010 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:25 crc kubenswrapper[4749]: I0225 08:30:25.176194 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2rzzw"] Feb 25 08:30:25 crc kubenswrapper[4749]: I0225 08:30:25.940502 4749 generic.go:334] "Generic (PLEG): container finished" podID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerID="952218d0b13d7af5457e3ee12d16c206b50a3673416d73941cbcc18449df53b8" exitCode=0 Feb 25 08:30:25 crc kubenswrapper[4749]: I0225 08:30:25.940568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerDied","Data":"952218d0b13d7af5457e3ee12d16c206b50a3673416d73941cbcc18449df53b8"} Feb 25 08:30:25 crc kubenswrapper[4749]: I0225 08:30:25.940964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerStarted","Data":"4d49ba4f63c6428ba4354d90978f09397afac3c87d220416f9c330510e9c268c"} Feb 25 08:30:26 crc kubenswrapper[4749]: I0225 08:30:26.956705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerStarted","Data":"020e6e8864c09e932e3ccd1f7218926296d5f0834727e88d9e18a8017b54f789"} Feb 25 08:30:28 crc kubenswrapper[4749]: I0225 08:30:28.987283 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerID="020e6e8864c09e932e3ccd1f7218926296d5f0834727e88d9e18a8017b54f789" exitCode=0 Feb 25 08:30:28 crc kubenswrapper[4749]: I0225 08:30:28.987387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerDied","Data":"020e6e8864c09e932e3ccd1f7218926296d5f0834727e88d9e18a8017b54f789"} Feb 25 08:30:30 crc kubenswrapper[4749]: I0225 08:30:30.000194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerStarted","Data":"2deb63321e62fc0a68850ba9d609ffe2f53aaafdec1ad2fe9a6ad6b53f6071a2"} Feb 25 08:30:34 crc kubenswrapper[4749]: I0225 08:30:34.675426 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:34 crc kubenswrapper[4749]: I0225 08:30:34.676109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:35 crc kubenswrapper[4749]: I0225 08:30:35.415035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:35 crc kubenswrapper[4749]: I0225 08:30:35.470078 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2rzzw" podStartSLOduration=7.940256004 podStartE2EDuration="11.470052573s" podCreationTimestamp="2026-02-25 08:30:24 +0000 UTC" firstStartedPulling="2026-02-25 08:30:25.942560202 +0000 UTC m=+4379.304386252" lastFinishedPulling="2026-02-25 08:30:29.472356771 +0000 UTC m=+4382.834182821" observedRunningTime="2026-02-25 08:30:30.029483037 +0000 UTC m=+4383.391309137" watchObservedRunningTime="2026-02-25 08:30:35.470052573 +0000 UTC 
m=+4388.831878633" Feb 25 08:30:35 crc kubenswrapper[4749]: I0225 08:30:35.483358 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2rzzw" Feb 25 08:30:35 crc kubenswrapper[4749]: I0225 08:30:35.686927 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rzzw"] Feb 25 08:30:37 crc kubenswrapper[4749]: I0225 08:30:37.079815 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2rzzw" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="registry-server" containerID="cri-o://2deb63321e62fc0a68850ba9d609ffe2f53aaafdec1ad2fe9a6ad6b53f6071a2" gracePeriod=2 Feb 25 08:30:37 crc kubenswrapper[4749]: E0225 08:30:37.841631 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0dbad93_6aba_4423_ac27_3fe32e5825d3.slice/crio-2deb63321e62fc0a68850ba9d609ffe2f53aaafdec1ad2fe9a6ad6b53f6071a2.scope\": RecentStats: unable to find data in memory cache]" Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.094356 4749 generic.go:334] "Generic (PLEG): container finished" podID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerID="2deb63321e62fc0a68850ba9d609ffe2f53aaafdec1ad2fe9a6ad6b53f6071a2" exitCode=0 Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.094464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerDied","Data":"2deb63321e62fc0a68850ba9d609ffe2f53aaafdec1ad2fe9a6ad6b53f6071a2"} Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.094727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2rzzw" 
event={"ID":"b0dbad93-6aba-4423-ac27-3fe32e5825d3","Type":"ContainerDied","Data":"4d49ba4f63c6428ba4354d90978f09397afac3c87d220416f9c330510e9c268c"}
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.094744 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d49ba4f63c6428ba4354d90978f09397afac3c87d220416f9c330510e9c268c"
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.113228 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rzzw"
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.212564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities\") pod \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") "
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.212648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content\") pod \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") "
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.212746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4zqp\" (UniqueName: \"kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp\") pod \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\" (UID: \"b0dbad93-6aba-4423-ac27-3fe32e5825d3\") "
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.213359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities" (OuterVolumeSpecName: "utilities") pod "b0dbad93-6aba-4423-ac27-3fe32e5825d3" (UID: "b0dbad93-6aba-4423-ac27-3fe32e5825d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.220842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp" (OuterVolumeSpecName: "kube-api-access-l4zqp") pod "b0dbad93-6aba-4423-ac27-3fe32e5825d3" (UID: "b0dbad93-6aba-4423-ac27-3fe32e5825d3"). InnerVolumeSpecName "kube-api-access-l4zqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.262914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0dbad93-6aba-4423-ac27-3fe32e5825d3" (UID: "b0dbad93-6aba-4423-ac27-3fe32e5825d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.315000 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4zqp\" (UniqueName: \"kubernetes.io/projected/b0dbad93-6aba-4423-ac27-3fe32e5825d3-kube-api-access-l4zqp\") on node \"crc\" DevicePath \"\""
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.315085 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-utilities\") on node \"crc\" DevicePath \"\""
Feb 25 08:30:38 crc kubenswrapper[4749]: I0225 08:30:38.315099 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0dbad93-6aba-4423-ac27-3fe32e5825d3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 25 08:30:39 crc kubenswrapper[4749]: I0225 08:30:39.105727 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2rzzw"
Feb 25 08:30:39 crc kubenswrapper[4749]: I0225 08:30:39.159255 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2rzzw"]
Feb 25 08:30:39 crc kubenswrapper[4749]: I0225 08:30:39.173532 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2rzzw"]
Feb 25 08:30:39 crc kubenswrapper[4749]: I0225 08:30:39.341146 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" path="/var/lib/kubelet/pods/b0dbad93-6aba-4423-ac27-3fe32e5825d3/volumes"
Feb 25 08:30:55 crc kubenswrapper[4749]: I0225 08:30:55.221164 4749 scope.go:117] "RemoveContainer" containerID="371a26222bf108fc5ec0f7373d3dc72b50284939bd04341520608005b5fd6743"
Feb 25 08:30:55 crc kubenswrapper[4749]: I0225 08:30:55.266796 4749 scope.go:117] "RemoveContainer" containerID="a7ce5101122fc5ed4d68321b33b4e8f99201a5157b61ef122dd8b93332583db9"
Feb 25 08:31:21 crc kubenswrapper[4749]: I0225 08:31:21.672122 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 08:31:21 crc kubenswrapper[4749]: I0225 08:31:21.672639 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 08:31:51 crc kubenswrapper[4749]: I0225 08:31:51.671558 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 08:31:51 crc kubenswrapper[4749]: I0225 08:31:51.672195 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.145685 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29533472-bb6vj"]
Feb 25 08:32:00 crc kubenswrapper[4749]: E0225 08:32:00.146479 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="extract-content"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.146490 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="extract-content"
Feb 25 08:32:00 crc kubenswrapper[4749]: E0225 08:32:00.146499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="extract-utilities"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.146505 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="extract-utilities"
Feb 25 08:32:00 crc kubenswrapper[4749]: E0225 08:32:00.146518 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="registry-server"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.146523 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="registry-server"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.146739 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0dbad93-6aba-4423-ac27-3fe32e5825d3" containerName="registry-server"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.147345 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.154260 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533472-bb6vj"]
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.157059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-bsvvp"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.157524 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.158229 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.268807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95p2k\" (UniqueName: \"kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k\") pod \"auto-csr-approver-29533472-bb6vj\" (UID: \"ed3078c0-1992-4ab3-ba15-8a86c0928f50\") " pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.370089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95p2k\" (UniqueName: \"kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k\") pod \"auto-csr-approver-29533472-bb6vj\" (UID: \"ed3078c0-1992-4ab3-ba15-8a86c0928f50\") " pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.393934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95p2k\" (UniqueName: \"kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k\") pod \"auto-csr-approver-29533472-bb6vj\" (UID: \"ed3078c0-1992-4ab3-ba15-8a86c0928f50\") " pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.483799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.966221 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29533472-bb6vj"]
Feb 25 08:32:00 crc kubenswrapper[4749]: W0225 08:32:00.970276 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded3078c0_1992_4ab3_ba15_8a86c0928f50.slice/crio-b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d WatchSource:0}: Error finding container b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d: Status 404 returned error can't find the container with id b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d
Feb 25 08:32:00 crc kubenswrapper[4749]: I0225 08:32:00.973650 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 25 08:32:01 crc kubenswrapper[4749]: I0225 08:32:01.068103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533472-bb6vj" event={"ID":"ed3078c0-1992-4ab3-ba15-8a86c0928f50","Type":"ContainerStarted","Data":"b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d"}
Feb 25 08:32:03 crc kubenswrapper[4749]: I0225 08:32:03.100258 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed3078c0-1992-4ab3-ba15-8a86c0928f50" containerID="6191ba9dfce3b5626e78fb4a7c521c36436106774b205fa96ac874b0fb947ffe" exitCode=0
Feb 25 08:32:03 crc kubenswrapper[4749]: I0225 08:32:03.100355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533472-bb6vj" event={"ID":"ed3078c0-1992-4ab3-ba15-8a86c0928f50","Type":"ContainerDied","Data":"6191ba9dfce3b5626e78fb4a7c521c36436106774b205fa96ac874b0fb947ffe"}
Feb 25 08:32:04 crc kubenswrapper[4749]: I0225 08:32:04.542863 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:04 crc kubenswrapper[4749]: I0225 08:32:04.664294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95p2k\" (UniqueName: \"kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k\") pod \"ed3078c0-1992-4ab3-ba15-8a86c0928f50\" (UID: \"ed3078c0-1992-4ab3-ba15-8a86c0928f50\") "
Feb 25 08:32:04 crc kubenswrapper[4749]: I0225 08:32:04.684066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k" (OuterVolumeSpecName: "kube-api-access-95p2k") pod "ed3078c0-1992-4ab3-ba15-8a86c0928f50" (UID: "ed3078c0-1992-4ab3-ba15-8a86c0928f50"). InnerVolumeSpecName "kube-api-access-95p2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 25 08:32:04 crc kubenswrapper[4749]: I0225 08:32:04.767578 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95p2k\" (UniqueName: \"kubernetes.io/projected/ed3078c0-1992-4ab3-ba15-8a86c0928f50-kube-api-access-95p2k\") on node \"crc\" DevicePath \"\""
Feb 25 08:32:05 crc kubenswrapper[4749]: I0225 08:32:05.123525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29533472-bb6vj" event={"ID":"ed3078c0-1992-4ab3-ba15-8a86c0928f50","Type":"ContainerDied","Data":"b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d"}
Feb 25 08:32:05 crc kubenswrapper[4749]: I0225 08:32:05.123581 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b23f846d4fbe5f25b4ff0d9a34aa82e169d946ef7c9129fc56258f164cd8703d"
Feb 25 08:32:05 crc kubenswrapper[4749]: I0225 08:32:05.123727 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29533472-bb6vj"
Feb 25 08:32:05 crc kubenswrapper[4749]: I0225 08:32:05.684007 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29533466-brbfw"]
Feb 25 08:32:05 crc kubenswrapper[4749]: I0225 08:32:05.701081 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29533466-brbfw"]
Feb 25 08:32:07 crc kubenswrapper[4749]: I0225 08:32:07.339985 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a76cbdb-bb28-447d-a3ab-954bbe80dcbc" path="/var/lib/kubelet/pods/0a76cbdb-bb28-447d-a3ab-954bbe80dcbc/volumes"
Feb 25 08:32:21 crc kubenswrapper[4749]: I0225 08:32:21.672155 4749 patch_prober.go:28] interesting pod/machine-config-daemon-ljd89 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 25 08:32:21 crc kubenswrapper[4749]: I0225 08:32:21.672895 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 25 08:32:21 crc kubenswrapper[4749]: I0225 08:32:21.672958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ljd89"
Feb 25 08:32:21 crc kubenswrapper[4749]: I0225 08:32:21.674121 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"} pod="openshift-machine-config-operator/machine-config-daemon-ljd89" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 25 08:32:21 crc kubenswrapper[4749]: I0225 08:32:21.674224 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a" containerName="machine-config-daemon" containerID="cri-o://d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174" gracePeriod=600
Feb 25 08:32:21 crc kubenswrapper[4749]: E0225 08:32:21.818210 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:32:22 crc kubenswrapper[4749]: I0225 08:32:22.302974 4749 generic.go:334] "Generic (PLEG): container finished" podID="1183771e-2d52-421f-8c26-0aaff531934a" containerID="d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174" exitCode=0
Feb 25 08:32:22 crc kubenswrapper[4749]: I0225 08:32:22.303017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" event={"ID":"1183771e-2d52-421f-8c26-0aaff531934a","Type":"ContainerDied","Data":"d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"}
Feb 25 08:32:22 crc kubenswrapper[4749]: I0225 08:32:22.303049 4749 scope.go:117] "RemoveContainer" containerID="c68274139adddc75b50f0b94d3deccee5e8a60696ed64cec9ee84433451e6187"
Feb 25 08:32:22 crc kubenswrapper[4749]: I0225 08:32:22.303851 4749 scope.go:117] "RemoveContainer" containerID="d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"
Feb 25 08:32:22 crc kubenswrapper[4749]: E0225 08:32:22.304341 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:32:36 crc kubenswrapper[4749]: I0225 08:32:36.323293 4749 scope.go:117] "RemoveContainer" containerID="d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"
Feb 25 08:32:36 crc kubenswrapper[4749]: E0225 08:32:36.325821 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:32:48 crc kubenswrapper[4749]: I0225 08:32:48.322855 4749 scope.go:117] "RemoveContainer" containerID="d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"
Feb 25 08:32:48 crc kubenswrapper[4749]: E0225 08:32:48.323616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"
Feb 25 08:32:55 crc kubenswrapper[4749]: I0225 08:32:55.448120 4749 scope.go:117] "RemoveContainer" containerID="18304353c3ef96d37df16f0ab893beac21c2e61c98e603636eec1408cabf946a"
Feb 25 08:33:01 crc kubenswrapper[4749]: I0225 08:33:01.323325 4749 scope.go:117] "RemoveContainer" containerID="d4da15d4c749e6dc47893dc6aa69ca897480754e7c4df56121c2fadc3fe16174"
Feb 25 08:33:01 crc kubenswrapper[4749]: E0225 08:33:01.324560 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ljd89_openshift-machine-config-operator(1183771e-2d52-421f-8c26-0aaff531934a)\"" pod="openshift-machine-config-operator/machine-config-daemon-ljd89" podUID="1183771e-2d52-421f-8c26-0aaff531934a"